Hi, I'm James Wilson, CEO of MyDataRemoval. We fight for privacy by spreading awareness of cybersecurity best practices and by removing personal information from hundreds of data brokers. Yes, the automatic facial recognition in Ring doorbells is a privacy concern. First, it scans faces and collects biometric data without formal consent, and we all have the right to know when our data is being collected and to opt out. Second, facial recognition is not 100% accurate, owing to factors like image quality, poor lighting, varied angles, and algorithmic bias. So beyond the lack of consent, there is a real risk of misidentifications that could lead to wrongful arrests, which is deeply concerning. For these reasons, Ring doorbell owners should consider not using this feature at all. If you do use it, one best practice is to use generic labels, like "Neighbor No. 1," instead of real names; this limits what data is exposed. Other best practices we recommend:

- Regularly audit or delete old profiles.
- Mount the camera about 4 ft high to avoid capturing passersby.
- Ask your neighbors whether they're okay with their biometric data being collected.

Best regards, James Wilson
Ariel Coro, Tech & Innovation Expert, Media Personality, Author & Keynote Speaker
I've spent years warning Spanish-speaking audiences about smart home security risks on Despierta America and CNN en Espanol, and Ring's facial recognition crosses a line I've been concerned about for a while. The FBI already warned about hackers taking control of insecure TVs and cameras--now we're voluntarily building facial databases that could be compromised or misused. Here's what bothers me most: Ring's parent company Amazon has a history of sharing doorbell footage with law enforcement, sometimes without warrants. When you enable facial recognition, you're not just scanning your invited guests--you're cataloging every person who walks past your property, creating a surveillance network your neighbors never consented to. This is different from motion detection or basic video recording. My recommendation from covering tech security for two decades: disable facial recognition entirely unless you're dealing with a legitimate stalking or harassment situation that law enforcement is actively investigating. Standard motion alerts and video recording already provide excellent security without building a biometric database. I tell my audience the same thing I said about smart speakers--just because the feature exists doesn't mean you need to use it. **Bio:** Ariel Coro is the on-air tech expert for Despierta America (Univision's #1 morning show), regular CNN en Espanol contributor, and author of "El Salto." He's been educating millions of Latinos about technology security and privacy for over 15 years across TV, radio, and digital platforms.
I've consulted with businesses across New Jersey on data privacy compliance, and Ring's facial recognition puts homeowners in the exact same legal trap we warn companies about. Remember the White Castle lawsuit? The company faced up to $17 billion in potential damages just for using employee fingerprint logins without proper consent in Illinois. Your Ring doorbell is now collecting the same type of biometric data from every person who walks past your house--neighbors, mail carriers, kids walking to school--none of whom consented. The real issue isn't just privacy violation; it's liability exposure. When Facebook got hit with that $725 million settlement for sharing user data without consent, it was only 0.62% of their revenue--pocket change. But for a homeowner? One lawsuit from a neighbor whose biometric data you've been collecting could be financially devastating. We're already seeing this with doorbell cameras being subpoenaed in disputes, and facial recognition adds a whole new dimension of legal risk. My recommendation is simple: disable facial recognition entirely. The feature doesn't meaningfully improve your security--you can still see who's at your door and get motion alerts without creating a biometric database. Check your Ring privacy settings under "Video Settings" and turn off any facial recognition or identification features. You're essentially running unauthorized surveillance on everyone within camera range, and the legal landscape is moving fast against this type of data collection. **Bio:** Paul Nebb is founder of Titan Technologies and a cybersecurity expert who has advised businesses from Fortune 500 companies to small enterprises on data privacy and compliance. He's presented at West Point Military Academy, the Harvard Club, and Nasdaq on cybersecurity risks and regulatory compliance.
I'm not a privacy expert, but I spent nearly a decade in aerospace and defense where we dealt with extremely strict biometric security protocols. At companies like Kratos Defense and Meta Special Aerospace, facial recognition systems required explicit consent documentation, regular audits, and air-gapped storage--none of which consumer devices like Ring provide. Here's what bothers me from an engineering standpoint: Ring's facial recognition creates a persistent database with zero structural redundancy or fail-safes. In aerospace, if we collected biometric data, we had multiple authentication layers and the data never left secure networks. Ring stores your face in the cloud where it can be accessed by law enforcement without a warrant in some cases, shared with third parties per their terms of service, and potentially breached like any other cloud database. The real issue isn't just privacy--it's lack of engineering discipline in how the data is protected. I'd disable this feature entirely unless Ring implements military-grade encryption standards and gives users full control over deletion. Your face shouldn't be treated with less security than the airplane parts I used to design. **Bio:** Jose Grados is owner of A Better Fence Construction and former aerospace engineer with nearly a decade at defense contractors including Kratos Defense, United Dynamics, and Meta Special Aerospace, where he managed precision-critical projects requiring strict security compliance and biometric access protocols.
I run a men's health clinic where patient privacy isn't just important--it's legally mandated under HIPAA. We handle incredibly sensitive data: testosterone levels, sexual dysfunction histories, STI results. If that information leaked, it could destroy careers, marriages, and reputations. Ring's facial recognition operates with *none* of those protections. Here's what concerns me from a healthcare compliance perspective: In my practice, every piece of biometric data requires documented consent, specific retention policies, and immediate patient access to deletion. We can't share anything without explicit authorization. Ring's terms let them keep your facial data indefinitely and share it broadly. That's not consent--that's a contract of adhesion most people never read. The medical standard is "minimum necessary"--collect only what you absolutely need, protect it rigorously, and destroy it when the purpose ends. Ring fails all three tests. Your doorbell doesn't *need* to identify faces to function; it just needs motion detection. The facial recognition is a business feature dressed up as a security one, and you're the product. I'd disable it immediately. In 17 years of handling sensitive patient data, I've learned that convenience features always become liability nightmares. Your face is more permanent than your credit card number--you can't just get a new one issued when there's a breach. **Bio:** Leonard A. Berkowitz, PA-C, MS is a nationally certified physician assistant and co-founder of the Center for Men's Health Rhode Island. Licensed since 2008 with an active National Provider Identifier, he manages HIPAA-compliant systems handling sensitive biometric and medical data for hundreds of patients annually.
I'm a physician who runs a longevity and hormone optimization clinic in Florida, and while I'm not a privacy lawyer, I deal with something arguably more sensitive than faces--patient biometric data tied to sexual health, hormone levels, and medical conditions people would never want exposed. Here's what strikes me about Ring's approach. The consent model is broken. In my telemedicine practice, before I can even discuss testosterone therapy or peptide treatments with a patient, I need documented confirmation they're physically in Florida, they understand what data I'm collecting, and they explicitly agree to it. Ring is collecting facial biometrics of people who never opened an app, never clicked "I agree," and might not even know that doorbell is scanning them. Your mail carrier didn't consent to be in someone's biometric database. What worries me most is the lack of medical-grade data protection. Under HIPAA, if I mishandle patient information--even accidentally--I face serious penalties including potential loss of my medical license per Florida Statute 456.072. Ring's terms of service don't carry that same weight. If their database gets breached or subpoenaed, those faces are out there forever with none of the legal protections that govern actual sensitive data. My take: disable it unless you have a documented security threat that justifies scanning everyone's face without permission. I wouldn't prescribe testosterone just because the technology exists--I prescribe it when medically necessary. Same principle applies here. Just because you can collect biometric data doesn't mean you should.
I'm Ralph Harris, owner of Salvation Repair in Mississippi with 20+ years in electronics repair and over 500 Apple certifications. I've physically opened thousands of devices and seen exactly what hardware manufacturers build into products--and more importantly, what they *don't* tell consumers about. Here's what concerns me from a hardware perspective: Ring doorbells store facial recognition data locally on the device before uploading to cloud servers. When devices come into my shop for repair, I can often recover "deleted" data that owners assumed was gone. If someone steals your Ring doorbell off your porch (happens constantly), they potentially have physical access to facial biometrics of everyone who's walked past your house. You can't factory reset a device that's already in someone else's hands. The bigger issue is repairability and data security overlap. Ring actively fights Right to Repair legislation, which means independent techs like me can't access proper service manuals or diagnostic tools to help customers truly secure their devices. When manufacturers lock down repair access, they also lock you out of understanding what data your device actually stores locally. I've seen this exact pattern with Apple's serialization practices--they want full control over the device lifecycle, which means you never truly own your hardware or the data on it. My recommendation: if you're keeping Ring, physically cover the camera when you're home and create a separate guest network for IoT devices that's isolated from your main network. Better yet, consider cameras that store 100% locally on SD cards you physically control--they exist, they're just not pushed by big tech because there's no subscription revenue.
I run a device repair shop in Albuquerque, and I handle data recovery and micro-soldering work daily--which means I see exactly what's stored on people's devices when something breaks. Here's what most people don't realize: your doorbell is collecting way more than just faces. When someone brings in a phone or tablet synced to their Ring account and I'm recovering their data, I can see their entire ecosystem--timestamps, motion alerts, tagged faces, and every piece of metadata Ring collects. That data doesn't live in isolation. It syncs across devices, gets stored in cloud accounts, and becomes part of a bigger profile. If any one device in that chain gets compromised, breached, or even just improperly wiped before resale, all of it is accessible. I had a customer who sold an old iPad without properly clearing it. The buyer could still access their Ring footage and facial recognition tags because the account sync hadn't been disabled. They only found out when the new owner contacted them. That's not a Ring-specific flaw--it's how interconnected ecosystems work--but it shows how biometric data doesn't stay locked to one doorbell. It travels. From a repair-side perspective, my recommendation is this: if you're keeping facial recognition enabled, treat every device connected to that account like it holds your house keys. Log out of accounts before repairs, use strong unique passwords, enable two-factor authentication, and understand that convenience always trades off with exposure. That's not fearmongering--it's just how the tech actually works. **Bio:** Cyndi Anastasio is owner of The Phone Fix Place in Albuquerque and a former Intel engineer with nearly 14 years of experience. She specializes in micro-soldering, data recovery, and circuit board diagnostics, helping customers understand how their devices store, sync, and expose personal data during repairs.
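Two-factor authentication comes up repeatedly in these answers, and it helps to see how little magic is involved. Below is a minimal, illustrative Python sketch of the RFC 6238 TOTP algorithm that standard authenticator apps implement; the secret used is the RFC's published test value, not a real credential, and this is not a claim about Ring's own 2FA internals:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, period=30):
    """Compute an RFC 6238 time-based one-time password from a
    base32-encoded shared secret -- the same math an authenticator
    app runs every 30 seconds."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if t is None else t) // period)
    mac = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = mac[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: this secret at t=59 yields "94287082".
secret = base64.b32encode(b"12345678901234567890").decode()
print(totp(secret, t=59, digits=8))  # -> 94287082
```

The point is not to roll your own crypto, but to see that the second factor is just a time-boxed code derived from a shared secret, so the secret (and any backup codes) deserves the same protection as the password itself.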
I'm David Symons, Managing Director at DASH Symons Group. We've installed and maintained more than 300 camera systems across licensed clubs, high-rise buildings, and gated communities in Queensland since 2008, including facial recognition systems for venues that need them for security and compliance reasons. The question most Ring owners aren't asking is what happens to your guests and delivery drivers who never consented to being in a facial recognition database. In our commercial installations, we're legally required to post signage and obtain consent before capturing biometric data. Your Ring doorbell doesn't do this for the plumber, your kid's friends, or the person walking their dog past your house. That's the real consent gap nobody's talking about. Here's what I've learned from deploying facial recognition in controlled environments: it only makes sense when you control the entire data pipeline and have a legitimate security need. In a licensed club with 30+ access-controlled doors, we use it because patrons are notified, there's a genuine safety reason, and the client owns their own server infrastructure. For a doorbell? You're creating biometric records of every person who approaches your home with zero control over that data once it hits Amazon's servers. If you already own a Ring with this feature, my recommendation from a systems perspective is simple: treat it like a standard motion-detection camera and disable facial recognition entirely. You'll still get alerts when someone's at your door, you'll still have video evidence if you need it, but you're not building a biometric database of your entire neighborhood that you don't control and can't protect.
I'm Dr. Maria Chatzou Dunford, CEO of Lifebit, where we've built federated data platforms for genomic and clinical data across five continents. I've spent 15+ years dealing with far more sensitive biometric data than faces--genetic sequences that reveal disease risk, ancestry, and biological identity. Here's what that experience taught me about biometric collection. The consent issue is real, but it's worse than most privacy advocates realize. When we analyzed voice recognition systems for clinical trials, we found that voice recordings contain biometric identifiers requiring automatic de-identification--even though participants explicitly enrolled in the studies. Ring's facial recognition captures your neighbors and visitors who never consented to anything. In our work under GDPR, that's called processing third-party data without a legal basis, and it's a showstopper. What concerns me most is the permanence problem combined with federated risk. In precision medicine, we use k-anonymity requirements (a minimum of 10 individuals per data point) specifically because you can't change your biometrics. Ring creates the opposite--a highly identifiable database of faces linked to physical addresses. If that data federates with other Amazon services or gets subpoenaed, you've lost control forever. My recommendation mirrors what we tell healthcare organizations: if you can't control where the data lives and who accesses it, don't collect it. Disable facial recognition and use motion detection instead. The security benefit of identifying faces at your door doesn't justify creating a permanent biometric record you can never take back.
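The k-anonymity requirement mentioned above (a minimum group size per data point) can be made concrete with a short sketch. This is an illustrative check over a toy dataset, not Lifebit's actual pipeline; the field names are invented:

```python
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k=10):
    """True if every combination of quasi-identifier values is shared
    by at least k records, so no individual stands out."""
    groups = Counter(
        tuple(r[q] for q in quasi_identifiers) for r in records
    )
    return all(count >= k for count in groups.values())

# Toy dataset: ZIP code and age band act as quasi-identifiers.
records = [{"zip": "02139", "age_band": "30-39"}] * 12 \
        + [{"zip": "02139", "age_band": "40-49"}] * 3

print(is_k_anonymous(records, ["zip", "age_band"], k=10))  # -> False
```

The 40-49 group has only 3 members, so the dataset fails the k=10 threshold. A database of faces linked to home addresses is the extreme opposite case: every "group" has exactly one member, which is exactly why it can never be anonymized after the fact.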
Hello, I'm Kos Chekanov, CEO of Artkai, a digital product design and development company. We've spent over a decade building customer-facing applications for FinTech and other heavily regulated industries where biometric data handling is a daily concern. Yes, Ring's facial recognition feature raises genuine privacy issues. The EFF is right that scanning faces without affirmative consent violates biometric privacy laws in Illinois, Texas, and several other jurisdictions. But this isn't just a legal problem. It's also a product design failure. Ring built a feature where the device owner becomes the de facto data controller for everyone who walks past their door. Your neighbor didn't consent to having their face scanned and stored in Amazon's system. They probably don't even know it's happening. Ring tries to solve this with an in-app reminder telling users to "comply with relevant laws," but that's passing the buck. Most people buying a doorbell camera aren't going to research biometric consent requirements. From a UX perspective, this creates what I call "forced gatekeeping", where regular consumers are suddenly responsible for navigating complex privacy regulations that even companies get wrong. We see this pattern often when features are designed around technical capability rather than user responsibility. Should Ring owners disable facial recognition? If you live in a state with biometric privacy laws, absolutely. But even beyond legal requirements, ask yourself: do you really need your doorbell to identify people by name, or would motion detection work just fine? The feature's convenience doesn't justify putting your neighbors' biometric data in a database without their knowledge. Best practice: disable it. Never use real names for people who haven't consented in writing. Bio: Kos Chekanov is CEO of Artkai, a global digital product design and development agency specializing in customer-centric applications for regulated industries including financial services. 
With over 10 years of experience in software development and UX design, Kos works with enterprises to build GDPR-compliant digital products that balance functionality with user privacy.
Hello, I'm Adrian Iorga, Founder & President at Stairhoppers (stairhoppers.com), a Boston-based moving company that's been in business for over twenty years. Running this company taught me that when people invite you into their homes, trust is everything. Ring's facial recognition feature breaks that trust in ways most people haven't thought through yet. In my opinion, yes, it's a privacy violation, but not just for the legal reasons the EFF mentions. It's creating neighborhood surveillance networks without any real oversight. Right now, Ring is not simply "enhancing security". When your doorbell starts cataloging faces of everyone who walks by, you're turning residential streets into biometric databases. The consent issue matters because unlike a regular doorbell camera that just records video, facial recognition creates permanent biometric profiles. Once your neighbor's device has mapped your face, that data lives in Amazon's system. You didn't agree to that. Most people walking past a doorbell have no idea it's even happening. Should Ring owners disable this? Absolutely. The feature doesn't make you safer. Motion alerts and regular video already do that job. What facial recognition adds is a false sense of control and real legal exposure. If someone in Illinois or Washington walks past your door and later finds out you were scanning faces without consent, you're potentially liable under state biometric privacy laws. In my opinion, the safest move is to turn it off entirely. Good security doesn't require tracking your neighbors' faces. Bio: Adrian Iorga is the founder and president of Stairhopper Movers, a Boston-based moving and storage company established in 2001. With a degree in economics and nearly 25 years building a customer service-focused business that completes over 20,000 moves annually, Adrian understands the critical importance of customer trust and privacy in service industries.
Stairhopper Movers has been named Best of Boston and Best of Massachusetts multiple times and maintains an A+ BBB rating.
A few years ago, I installed a smart doorbell at my own home to track package theft. I remember being surprised later when an update introduced facial recognition features I hadn't actively opted into. That moment reframed how I think about consumer privacy. Facial recognition is not just video. It is biometric data, and the consent bar should be much higher. The real issue is not the homeowner, it is everyone else who gets scanned without notice. Groups like the EFF are right to flag this. Ring owners should disable facial recognition where possible, shorten retention windows, and avoid sharing data by default. Convenience should never outrun informed consent. Bio: Albert Richer is the founder of WhatAreTheBest.com, a product comparison platform that evaluates consumer technology with a focus on transparency, risk, and user trust.
In my view, always-on facial recognition on a consumer doorbell camera is at least a high-risk privacy practice—and in some states it can plausibly cross into a legal violation depending on how biometric laws define "collection," "use," and "affirmative consent." The core issue is that a doorbell doesn't just capture the owner's household; it can scan the faces of delivery drivers, neighbors, guests, and passersby who never opted in. That's exactly why groups like the EFF (see https://www.eff.org/deeplinks/2025/11/legal-case-against-rings-face-recognition-feature) argue this kind of feature fails the "affirmative consent" standard many biometric regimes contemplate. Should owners disable it? If you're a law firm partner, attorney, or anyone with a heightened confidentiality/safety profile, I'd lean yes unless you have a clear, narrow use-case (e.g., recognizing immediate family at the door). Even when the feature is "optional," the practical risk is that the most privacy-invasive part happens before anyone can meaningfully consent—because the visitor doesn't control the homeowner's device. Policymakers are already scrutinizing these risks, including concerns about how the feature impacts delivery drivers and other non-users. If owners keep it enabled, "best practices" look like this:

- Use the most restrictive mode: limit recognition to a small, defined set like household members; don't label delivery drivers or casual visitors.
- Post clear notice at entrances ("Video recording in use; facial recognition may be enabled") so visitors have a meaningful heads-up.
- Minimize retention and sharing: keep clips for the shortest practical window, review any sharing defaults, and avoid integrating face labels into other apps/services unless necessary.
- Treat it like sensitive data: strong account security (unique password + MFA), least-privilege for shared accounts, and periodic audits of who has access and what faces are stored.
**Bio:** Nate Nead is the CEO of SEC.co, where he advises SMBs and professionals on corporate security, cyber risk, and privacy-by-design operational controls, including surveillance, identity, and incident response policy governance.
I'm an attorney, and I've handled countless privacy cases over the years; video surveillance plays a role in today's litigation in almost every case. Car accidents and dash cams. Grocery stores and CCTV. The key issue here is going to be consent. With facial recognition like Ring's, it's not just recording video, it's saving biometric data, too. That creates a whole new level of legal liability, given how sensitive the information being captured is--essentially out in public. Ring scans every face just to figure out who it knows and who it doesn't. That means your delivery driver, your neighbor, or someone just out walking their dog could have their face processed without ever agreeing to it. From a privacy standpoint, that's a problem. Someone doesn't even have to be on your property to be caught on video. So unless you absolutely need facial recognition, leave that feature off. The main video and motion detection features offer you plenty of protection in the first place. Let's be honest, it feels like we're living in a new Wild West of technology. AI is everywhere now, from facial recognition to drones, even in our washing machines. These tools aren't going away. That's exactly why we need to be especially cautious about the protections we put in place. We can't let innovation move faster than the laws meant to protect us.
If you walk to someone's door, there's a reasonable expectation that the encounter will not be private; however, it gets tricky when your biometric data is being recorded and collected. The UK and Europe err on the side of protecting users more than protecting tech, but we do like our surveillance. It could go either way. Ring will likely have to make doorbells where users can opt out of data collection. But even in that case, a user who opts in is collecting biometric data that does not belong to them. Ring could make a self-contained system that does not collect data, but that would likely be far too expensive to be marketable. The only way I could see this going to market, and staying there, is if each person who approaches the door is prompted to opt in or opt out, and that is terribly clunky. Perhaps Ring could start their own network, and only their users opt in? It's going to be interesting to see this one play out. Bio: Arif began his career as a Systems Administrator. After several years of climbing the ladder, Arif made the leap to Technical Director at Just After Midnight to provide the most efficient and effective management services and support to their clients 24 hours per day, 7 days per week.
Facial recognition in Ring doorbells is highly problematic, as every visitor's face is scanned without their consent. Thus, the privacy of every visitor is harmed, and the data is sent to Amazon. Like any Big Tech company, Amazon is interested in using this data for its own purposes, for instance to improve its AI features. However, the people whose data is being used cannot be asked for consent, as it's not their own doorbell. As a representative of Tuta Mail, a privacy-first email service, my expertise lies in data protection and legal compliance. In addition, we at Tuta fight for the user's right to privacy and share the EFF's position on the problematic use of Ring doorbells' facial recognition feature 100 percent. I would highly recommend owners of Ring doorbells disable the facial recognition feature--not only to protect their visitors' data, but also out of respect for their visitors. It is not okay to share someone else's data with a Big Tech company without first asking for consent.
The issue is concrete, not theoretical. Facial recognition turns a simple camera into a biometric system, which introduces a very different set of legal and ethical obligations, especially when consent is implied rather than explicit. In my view, the risk is less about homeowners choosing to use the feature and more about everyone else who did not. The core issue is third party exposure. Individuals who pass through the camera's view did not knowingly agree to biometric data collection, even though many laws require clear, informed consent. A sticker on a door or terms buried in an app do not meet that standard. The moment a system starts identifying faces rather than simply capturing video, the bar rises sharply. I have seen similar systems introduced with good intentions that later created compliance and trust problems. The technology tends to move faster than policy, and companies often frame opt in features as user controlled while overlooking third party exposure. That gap is where violations usually occur. For Ring owners, disabling facial recognition is the safest option if the feature is available. If it is used, owners should apply strict practices. Limit recognition to household members only. Disable data sharing beyond the device. Review retention settings regularly. Make signage explicit and visible. Even with these steps, the legal exposure does not disappear, because the homeowner is not the only subject affected. There is also a broader issue of normalization. Once facial recognition becomes routine in consumer devices, expectations shift quietly. Data collected today can be repurposed tomorrow. I have watched organizations regret adopting biometric systems before standards were clear, because rolling them back is harder than not deploying them in the first place. This does not mean the technology has no place. It means consent, scope, and necessity must be narrow and explicit. Security tools work best when they minimize collateral data collection. 
When a feature requires explaining why it is safe rather than why it is needed, that is usually a warning sign. The decision is not only about features. It is about responsibility. Biometric data carries weight, and facial recognition raises that burden.
In my view, Ring's integration of facial recognition without explicit, active consent isn't just a technical leap; it's a fundamental breach of the 'Privacy by Design' principles we uphold for our clients. The core issue is 'passive biometric harvesting.' When a visitor approaches a door, they have not affirmed consent to have their unique facial geometry mapped and stored. From an IT leadership perspective, I recommend owners disable this feature. Not only does it flirt with the edges of biometric data collection laws (like the UK GDPR or Illinois' BIPA), but it also creates a high-value database that becomes a prime target for threat actors. Best Practice for Ring Owners: If you choose to use these features, you must treat your home like a professional business—post clear signage indicating that biometric processing is in effect. However, the most secure route for the 'privacy-conscious' is to stick to motion-based alerts and avoid the biometric 'convenience' trap entirely.
From my perspective as someone who regularly covers consumer tech and privacy issues, Ring's move toward facial recognition is understandably controversial. Doorbell cameras don't just capture homeowners — they also record neighbors, delivery workers, and visitors who never actively agreed to have their biometric data analyzed. That's where the concern really starts. Video footage on its own is one thing, but facial recognition adds another layer. Biometric data is persistent and much harder to take back once it exists. Even if Ring positions this feature as optional, most people don't fully understand how facial data is stored, how long it's retained, or whether it could be repurposed later. For most homeowners, I'd recommend disabling facial recognition unless there's a clear, specific reason to use it. If someone does keep it enabled, basic precautions matter — limiting recognition to household members, reviewing stored data regularly, and being mindful that the camera covers a shared public-facing space, not a private room. Smart home security is useful, but it shouldn't quietly shift expectations around everyday surveillance. Facial recognition works best when consent is explicit and informed, which is difficult to guarantee with devices that monitor public or semi-public areas. Bio: Sai Upendra is a technology writer at HytechEra who focuses on consumer technology, mobile apps, and digital privacy, with an emphasis on how everyday users are affected by smart devices.