Prison Tech Comes Home

Landlords’, bosses’, and schools’ intrusion of surveillance technologies into the home extends the carceral state into domestic space.

Throughout the pandemic, new surveillance systems—used by landlords, educational institutions, and employers—have converged, capturing new forms of data and exerting new forms of control in domestic spaces. COVID-19 prompted bosses and schools to accelerate the deployment of surveillance and tracking systems. As the pandemic drags on, many are still monitoring and assessing remote learners and workers in their most intimate environments. Landlords, meanwhile, took the pandemic as a time to promise “touchless” convenience and increased control over the homes of their tenants, rushing to install tracking systems in renters’ homes. Whatever the marketing promises, ultimately landlords’, bosses’, and schools’ intrusion of surveillance technologies into the home extends the carceral state into domestic spaces. In doing so, it also reveals the mutability of surveillance and assessment technologies, and the way the same systems can play many roles, while ultimately serving the powerful.

Abolitionist organizers and scholars have long demonstrated the ways in which the carceral state exists well beyond prisons, jails, and police. As Michelle Alexander reminds us in her foreword to the excellent book Prison by Any Other Name, “‘Mass incarceration’ should be understood to encompass all versions of racial and social control wherever they can be found, including prisons, jails, schools, forced ‘treatment’ centers, and immigrant detention centers, as well as homes and neighborhoods converted to digital prisons.”1

This abolitionist frame helps us understand the means by which surveillance and tracking technologies serve to transform renters’, students’, and workers’ homes into a form of “digital prison,” subjecting domestic spaces to control, punishment, and persistent intrusion. While we in no way wish to collapse the meaningful differences between the brutal caging of human beings in jails and prisons with comfortable-but-surveilled living spaces, we believe that examining the continuum between the two is imperative for understanding how the carceral state leverages computational technologies of surveillance and control to extend beyond the punishing walls of the prison. This allows us to better apply abolitionist frameworks in the process of mapping these new and often subtle manifestations of the carceral state, especially within the racist housing landscape marked by unprecedented renter debt and an impending wave of evictions. It also enables us to consider alternative futures, encouraging us to imagine a world not built on punishment, but instead on systems of mutual aid and caring justice.


Landlord Tech

In the wake of the 2007–10 subprime mortgage crisis, the real estate industry began investing heavily in what it calls “proptech,” or property technology. This industry-crafted market, which is not always visible to renters or the public, includes the platforms, software, hardware, and data-collection systems used by speculators, landlords, and developers to increase control over tenants, as well as to extract more profit from real estate.

Following many discussions with tenants and housing-justice organizers, we made a conscious decision to use the term “landlord tech” instead of “proptech.” In making this choice, we sought to more clearly signal whose interests are served in deploying surveillance and tracking in domestic spaces, and to choose a term more legible to tenants unfamiliar with insider industry jargon.

While the investment in landlord tech spiked after 2008, this is not where the intersection of residential surveillance and carceral tech begins. Prior to contemporary forms of high-tech tenant screening, lower-tech assessment systems had long been used to exclude people with criminal records and to make finding shelter difficult (if not impossible) for those recently decarcerated. Tenant screening emerged in the 1970s, when landlords gained access to eviction records, allowing them to blacklist tenants who’d been previously displaced and to prevent them from securing new housing. It was also during this time that landlords began using CCTV cameras to monitor building common spaces, a racialized practice that has dramatically expanded in contexts of public, low-income, and affordable-housing complexes.

The landlord-tech industry grew in the wake of 9/11, particularly in the realm of tenant screening, and ballooned following the subprime crisis, which also saw a boom in corporate landlordism more broadly. Wall Street investment firms such as Blackstone swallowed up huge swaths of foreclosed single-family homes; Black people and people of color bore the costs of risky subprime financial instruments, while the institutions that created those instruments received bailouts and, in the case of companies like Blackstone, richly benefited from this racist displacement. With consolidated ownership came the need for new systems to manage thousands of properties, remotely and at scale. Landlord tech provided the perfect solution for corporate landlords, offering online and app-based systems for rental payment, utility monitoring, building access, and eviction automation. In the realm of public and low-income housing, landlords even began adding biometric facial-recognition cameras to already invasive CCTV monitoring systems—many of which have direct lines to law enforcement.

As of March 2021, over 10 million renters couldn’t pay their rent due to COVID-19-related hardships; by June 2021, 5.8 million renters (14 percent of all renter households) were behind on rental payments, adding up to $20 billion in national rental debt, according to the National Equity Atlas. Meanwhile, corporate landlords, including Blackstone, are following their familiar disaster-capitalist playbook to amass huge profits and develop new markets, for instance in commercial real estate. And landlord-tech companies are aiding them in this pursuit, extending corporate landlords’ reach, with catastrophic examples near and far. Meanwhile, the threat of displacement looms large, with an expected flood of evictions to come when already-weak eviction moratoria expire throughout 2021. As was the case during the subprime crisis, evictions are likely to fall along familiar race, gender, and class lines.

Surveillance and tracking technologies transform homes into a form of “digital prison,” subjecting domestic spaces to control, punishment, and persistent intrusion.

Examining the particularities of landlord-tech companies can help us better understand these dynamics. For example, US tenant screening today is dominated by companies such as AppFolio, CoreLogic, RealPage, On-Site, and SmartMove, which combine tenant-eviction records, credit checks from companies like TransUnion and Experian, and criminal-record checks. And because decarcerated residents often lack the privilege of good credit histories and records of recent tenancy, they are doubly discriminated against in the rental market. Moreover, landlord tech often prevents those freed from prison from securing housing, using their records of criminalization as an excuse to deny them shelter. This compounds a well-documented dynamic: surveillance cameras, facial-recognition systems, and neighborhood-surveillance apps such as Nextdoor contribute to the criminalization and harassment of Black, brown, and poor people on one end, while tenant screening contributes to the marginalization of already-criminalized people on the other.

While tenants have been struggling and organizing against evictions and unpayable rents, landlord-tech companies have seen the pandemic as a business opportunity. Starting in the early days of the pandemic, these companies began marketing thermal cameras, CCTV systems, neighborhood-surveillance apps, and “smart” property-management platforms as tools for safety and health, often pivoting from earlier marketing. In so doing, they made a variety of bold and often unsubstantiated claims. The landlord-tech company Virtual Doorman, for instance, suggests that touchless technologies (like CCTV, video intercoms, and remote access control) allow buildings to follow COVID-19 reopening and social-distancing policies while remaining operational. Amazon’s new Alexa for Residential followed a similar script, pointing out that Alexa requires zero setup for new tenants. And the access-control company StoneLock advertised their “frictionless” recognition solution: a touch-free way to ensure property is only accessed by authorized people. The specifics of how—or whether—these systems work to allay COVID-19 have been left out.

Other surveillance companies, like ClearView, have even promoted outlandish “COVID-19 detection” features, like their temperature-check thermal camera system TCB601, which allegedly detects people with high body temperatures. A similar product from TotalSecurity, the elevated body-temperature camera, advertised itself (to bosses) as a means for allowing people to “safely” return to work. The company Bioconnect has even promised that their technology can help landlords distinguish people infected with COVID-19 from those who aren’t. As with many claims made by tech companies, these and similar “COVID-19 detection systems” aren’t backed by evidence. In fact, they have been proven ineffective. Yet this has not stopped exuberant and inaccurate marketing.

Meanwhile, companies such as Naborly added new features to their tenant-screening software, offering to track tenants unable to pay rent during and after COVID-19. Naborly’s private screening bureau aggregates data into reports of “delinquent tenants,” which then get sold back to landlords. This service allows landlords to effectively blacklist people experiencing financial hardship and prevent them from securing future shelter.

Prepandemic, many of these same systems were marketed as tools to sort “bad” people from “good.” While not explicit, in the context of a culture that disproportionately criminalizes Black and brown people, this is an already-racist framework, and it served to subject Black and brown people, and the unhoused, to harassment and harm. The almost-seamless transition from “bad people” to “bad viruses” (and, implicitly, those who carry the viruses) highlights the categorical framing that companies maintain. It also shows how technologies that work to classify and sort people are malleable in terms of their particular application, while remaining consistent in terms of the ideology they encode and perpetuate.

It may seem startling that companies can continue to make wildly inaccurate claims about their products in the face of countervailing evidence. This is in part because landlord tech, like most consumer technology, is produced by the private sector. As such, it is often shielded from public scrutiny by corporate and trade secrecy. Indeed, most of these companies’ primary customers are landlords, and yet they frequently mask this fact, marketing their products and services as “smart” amenities for renters.

The company ADT, for instance, vacuously suggested their systems “give renters more ownership” over their rental, even though they provide monitoring data to landlords, not tenants, and would ultimately be purchased and installed by landlords, without consent or even involvement from renters. Citing data from surveys conducted by industry associations like the National Multifamily Housing Council and the National Apartment Association, websites for both Amazon Alexa for Residential and ADT Smart Communities allege that a majority of renters not only want this smart technology in their apartments but are also willing to pay extra for it.

Landlord-tech companies capitalize on the imaginary of neighborhood security through smart cities, often advertising their neighborly attributes. A commercial for ADT, for instance, displays multiple uses of their smart tech in the home and for social distancing, emphasizing that “no one has more ways to help keep what matters most safe.” Their resource material on “Being a Good Neighbor in the Face of a Pandemic (COVID-19)” encourages consumers to use neighborhood snitching apps like Nextdoor and Citizen to monitor delivery and other service workers. Amazon’s Neighbors by Ring app, meanwhile, conflates “staying connected” and “supporting neighbors” with police partnerships. What this means in practice is that video footage recorded by Amazon Ring is shared by Amazon with local police.

Even before the pandemic, neighborhood-policing apps had been shown to exacerbate racist surveillance. These apps encourage residents to surveil those perceived as “out of place” or “disruptive.” And while Amazon’s June 2020 press release promised to pause the sale of their facial-recognition products to police after the murder of George Floyd, this “woke” PR signaling didn’t extend beyond this discrete promise. Within a month of Floyd’s murder, Amazon Ring signed 29 new police partnerships; today, there are over 2,000 in the US alone. In the pandemic year 2020, Ring supplied data in response to over 20,000 police requests.


School and Work

Landlord tech isn’t the only surveillance industry that took advantage of COVID-19 to increase its market reach. As schooling and work moved into homes, domestic surveillance followed. Bosses and educational institutions have begun using similar systems, demanding access to private lives and environments in the name of worker and student control. And while pandemic restrictions are relaxing in many areas of the US, the use of these systems does not appear to be doing the same.

Revealing the mutability of domestic-surveillance technologies, a blog post on Ring’s website offered instructions on “How to Help Kids Be Safe, Engaged and Connected This School Year,” displaying a teacher using Ring to leave their students a message. This same Ring system used by police and landlords thereby morphs, through marketing alone, to appeal to educational institutions. But however diverse the markets targeted, the anti-Black racism baked into these systems remains.

So-called e-proctoring surveillance systems have also proliferated alongside online schooling. Companies like Proctortrack and Honorlock market themselves to universities as arbiters of academic integrity. Some of these systems, already proven faulty, are equipped with AI-enabled features that track “abnormal eye movements” to detect cheating. Yet Black students are disproportionately misrecognized by these features due to their racist labeling systems, and are thus at risk of being erroneously labeled as cheating. Indeed, these flawed proctoring systems have already resulted in students facing penalties. In recent months, for example, Dartmouth’s Geisel School of Medicine used data from e-proctoring services and course-management systems to bring charges of academic dishonesty against multiple students, and has rolled out strict social-media policies in response to students speaking out against e-proctoring.

Beyond proctoring software, the requirement that students use Zoom and Google Classroom to attend class raises other concerns. Student protesters and civil rights groups flagged privacy issues, noting that the institutions (in this case, schools) who license Zoom and Google Classroom can record and monitor interactions over the software, which may include intimate and messy home life. Zoom has even taken the unilateral action of canceling accounts or events hosted on the platform that the company didn’t agree with politically, providing a tangible example of the perils of leaving our shared intellectual life in the hands of barely accountable tech companies.

Schools and teachers have proceeded to employ these surveillance systems to enforce standards for dress and behavior, normalizing remote discipline and social control in the home. And they have done so while punishing poor students, and those whose domestic environments don’t live up to normative ideas about the middle-class nuclear family: where every child has a strong internet connection, their own bedroom and laptop, and no duties of care.

The racist and classist consequences of this kind of monitoring and remote discipline are already apparent. A school called the police on a Black student for playing with a toy gun while “attending class” via Zoom. Another Black child was removed from her family and sent to juvenile detention simply for failing to complete her homework. Further, there are multiple reports of Black students being unable to access exams because their faces are not recognized by e-proctoring software. These students describe needing to shine bright lights on their faces in order to be recognized, which places extra stress and burden on students of color compared to their white classmates.

The closing web of surveillance systems in domestic spaces will only exacerbate gendered and racialized inequality.

As with schooling, so with work. Frontline workers in particular have encountered increased surveillance and monitoring, rationalized through what scholar and activist Naomi Klein describes as “coronavirus capitalism.”

Amazon warehouse workers have been monitored through an AI-powered social-distancing “Distance Assistant,” even as union-tracking software was being deployed to thwart organizing efforts. Amazon also announced plans to install AI cameras to surveil delivery drivers from multiple invasive angles to detect distracted driving. Drivers are also often surveilled by Amazon’s Ring and other landlord-tech home-monitoring systems as they drive through unfamiliar areas delivering packages.

For those working from home, tracking and automated assessment have also surged. Work-issued laptops and mobile devices now host an increasing number of surveillance and assessment systems to monitor workers. Call-center workers laboring for large tech companies report that their contracts force them to agree to in-home “monitoring by AI-powered cameras,” which includes capturing surveillance footage from “the worker’s family members, including minors.”

Some of these systems provide worker “productivity scores,” measuring worth and output via opaque metrics, often based on proxies such as time spent on a given site or typing speed. Platforms that use eye tracking, motion detection, and online-activity surveillance contribute to creating rigid and punitive work expectations. Such demands force workers to choose between, for example, assisting a child or a loved one or performing “attentiveness” and “productivity” for an inflexible algorithm.

This inflexibility, combined with the toll of gendered domestic labor, has reproduced white heteromasculine dominance in the workforce, which disproportionately harms Black and Latinx women. And because Black women have historically also been most targeted by evictions due to inability to pay unpayable debt, the closing web of these systems in domestic spaces—landlord tech, school surveillance, and worker monitoring—will only exacerbate gendered and racialized inequality.


Making a Home a Prison

COVID-19 moved school and work into the home, and companies looking to profit from disaster worked to market domestic-surveillance tech as “necessary” to accomplish this shift. In doing so, they worked to integrate carceral logics of surveillance and social control into everyday domestic life and labor. They brought the prison into the home, in some ways quite literally: while home-, work-, schooling-, and prison-surveillance systems are often packaged as distinct, we see that they are in fact often composed of similar technical capabilities packaged differently. Many companies that build these surveillance systems sell the same (or very similar) systems as “solutions” to vastly different problems, depending on the market they’re targeting.

For example, prison-tech companies like Sierra Wireless, which sells the technology for electronic-ankle-monitoring devices that literally convert homes into digital prisons, also market their similar systems as vital components of so-called smart cities, health care, and retail environments. Sierra Wireless is not alone in marketing their prison-surveillance technologies across multiple sectors—GTL boasts the application of their technologies across corrections, government, and education sectors, while Laipac sells their ankle-monitoring devices for everything from home-offender monitoring to home-quarantine infectious-disease control. And as we have seen with Ring, a “one size fits many” approach to surveillance points to this convergence and prefigures what a unified surveillance system reporting to schools, landlords, bosses, and police might look like.

Meanwhile, conditions in prisons, jails, and immigrant-detention centers, which were already torturous, have only gotten worse during the pandemic. Prisoners are denied safe living conditions while being forced into contact with the virus and punished for exhibiting symptoms. People have been subject to intensified isolation in modes defined as torture, losing access to in-person visits, library access, higher-education programs, and more. Companies supplying prison-telecommunication platforms and hardware for increased video-call utilization are meanwhile amassing major profits while remaining unaffordable for most prisoners. These companies have even begun to monitor calls made by prisoners for “keywords” regarding COVID-19 under the auspices of virus mitigation, raising concerns about new surveillance precedents and convergences with other home-based video-monitoring platforms.



To address the many harms that attend the encroachment of surveillance tech into the home, the work of researching and resisting these systems must be informed by the ongoing work of prison abolition. This means that research needs to begin by recognizing the prison-industrial complex, which is based upon a long and enduring history of racial capitalism, as key to the inequities and injustices that such systems amplify and extend. Researchers and organizers must center these connections, treating questions of how and in what way such systems will serve to criminalize those already targeted through racist surveillance regimes as a central concern, not an afterthought to be bucketed with questions of bias or fairness. Such attention to questions of criminalization and harm can also help move this work beyond critique and “naming harms” to envisioning worlds otherwise. As abolitionist organizer and author Mariame Kaba notes, abolitionist frameworks and practices must go beyond envisioning a society without the prison-industrial complex, and take on the task of creating a world that prioritizes a dignified, comfortable, and safe life for all.

We are seeing abolitionist knowledge production and organizing focus on technologies of surveillance and social control. For instance, in the UK, abolitionist groups have demanded the full release of those incarcerated in detention centers, including those of Medway Secure Training Centre, slated to become England’s first “secure school” for youth prisoners. Similarly, in the US, a growing movement is working not only to defund and abolish the police but to remove police from school campuses. This movement complements, and in some cases overlaps with, student resistance against the use of academic-surveillance technologies in particular. Student organizers have launched petitions demanding administrators discontinue university use of proctoring tech, successfully pressuring schools to divest from contracts with Proctortrack and Proctorio. Parents concerned about the increased surveillance of their children due to e-proctoring software created a site sharing testimonials about their experiences, along with resources for activism. The activist group Fight for the Future also recently published their “Ban EProctoring Scorecard” in order to track different university policies regarding e-proctoring software.

Housing-justice organizing groups have also embraced abolitionist approaches, with translocal campaigns calling to cancel rent and abolish evictions and private property, in addition to embracing mutual-aid frameworks, organizing rent strikes, pushing for eviction moratoria, and fighting the violent connection between police and eviction. Groups such as the Anti-Eviction Mapping Project, SAJE, and the Chicago DSA, among others, have also created an array of resources, mapping corporate-landlord ownership portfolios (which are often vast, and cleverly hidden within complex business architectures). This provides essential information that aids multi-building organizing campaigns, and helps tenants understand whom to hold accountable—something that is especially important when interaction with a landlord is mediated by an impersonal, and unnamed, landlord-tech service.



We can also take valuable lessons from the work of the Atlantic Plaza Tower Tenants Association, which successfully thwarted the deployment of a biometric heat-mapping facial-recognition system sold by the company StoneLock. This technology was installed under the pretext of allowing the landlord to monitor building access, but tenants feared it would be used for increased racial profiling, racist evictions, and gentrification. After all, prior to their landlord attempting to install StoneLock, tenants had already been targeted by the landlord’s CCTV cameras for tenant organizing, and for petty lease violations. Based on this experience, they knew that advanced biometric technology would only exacerbate what resident and tenant organizer Tranae’ Moran described to us in an interview as a system of “digital slavery.” Moran worried that new surveillance would turn her home, which is already filled with CCTV cameras, into a juvenile-detention center for her son. Alongside other tenant organizers, Moran successfully fought back against StoneLock through a multipronged strategy of direct action, media campaigning, and alliance building. This organizing work was part of the inspiration for Landlord Tech Watch, which we are part of, and which offers tools and resources for understanding landlord tech, as well as a guide produced by the Ocean Hill–Brownsville Tenant Alliance on how to fight back.

Ananya Roy, in writing of the abolitionist project of “undoing property,” suggests that “abolition of property and the abolition of police go hand in hand.” Indeed, policing, as Simone Browne reveals in Dark Matters: On the Surveillance of Blackness (2015), has long been a technique of white ownership, constituting what Cheryl Harris describes as “whiteness as property.” Bosses, landlords, prison companies, and tech companies alike uphold this racial-capitalist and settler-colonial structure, one exacerbated through COVID-19 capitalism. Divestment from whiteness, as Roy puts it, means divestment from this amalgamated structure alongside abolitionist organizing. Part of this work involves forging and supporting solidarity work among tenants, workers, and those incarcerated in the ongoing fight for freedom. Part of it also involves mapping the historic and updated geographies of carcerality to better plan the ongoing work of abolition. As Charmaine Chua reminds us, abolition is a time during which “what seemed impossible has suddenly become possible.”


This article was commissioned by Mona Sloane.

  1. Michelle Alexander, foreword to Prison by Any Other Name: The Harmful Consequences of Popular Reforms, by Maya Schenwar and Victoria Law (New Press, 2020), p. xiii.
Featured-image photograph by Gemma Evans / Unsplash