Privacy Zuckering

AKA: The Default-Public Trap · Social Overexposure · The Permissive Default

Bureau Classification

Privacy Violation · ☠️ War Crime

What It Does

Privacy Zuckering configures default settings to maximize data sharing, social exposure, or behavioral tracking, operating on the well-documented principle that most users never change defaults. When a platform launches with public-by-default profiles, default-on location sharing, and default-broad data sharing with "partners," it is making a deliberate choice about who benefits from user inertia — and that choice is consistently not the user. The pattern extends beyond social media: browser fingerprinting enabled by default, cookie consent with legitimate interest pre-ticked, app permissions defaulting to "allow always," smart device telemetry defaulting to "full data sharing." The label comes from Mark Zuckerberg's documented belief, expressed in early Facebook communications, that people who willingly share data are displaying "dumb f***" behavior — a philosophy that did not prevent the company from being architected to maximize exactly that sharing.
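The mechanics are easiest to see in configuration terms. The sketch below contrasts a hypothetical settings schema whose defaults maximize exposure with one configured for privacy by default. Every field name and value is invented for illustration; it is not taken from any real platform's settings API.

```typescript
// Hypothetical settings schema for illustrating the pattern. All names
// and values below are invented for this sketch.
interface PrivacySettings {
  profileVisibility: "public" | "friends" | "private";
  locationSharing: "always" | "while_using" | "off";
  adPersonalization: boolean;
  shareWithPartners: boolean;
  telemetry: "full" | "basic" | "off";
}

// Privacy Zuckering: maximum exposure is the zero-effort path.
const zuckeringDefaults: PrivacySettings = {
  profileVisibility: "public",
  locationSharing: "always",
  adPersonalization: true,
  shareWithPartners: true,
  telemetry: "full",
};

// Roughly what data protection by default would look like instead:
// the user must opt in to each form of exposure.
const privacyByDefault: PrivacySettings = {
  profileVisibility: "private",
  locationSharing: "off",
  adPersonalization: false,
  shareWithPartners: false,
  telemetry: "basic",
};

console.log({ zuckeringDefaults, privacyByDefault });
```

The point of the contrast is that both configurations cost the platform the same to build; the choice of which one ships as the default is the dark pattern.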

Why It Works

Default effects are among the most powerful forces in behavioral economics. People systematically stick with default settings across domains — organ donation rates, retirement savings contributions, software configurations — because changing a default requires a decision, and decisions require effort. Privacy Zuckering weaponizes this tendency by placing the maximum-sharing option in the position of least cognitive resistance. It also benefits from information asymmetry: users rarely know what the defaults are, what they mean in practice, or what data sharing entails. By the time they find out, the data has been collected, shared, and processed for years.

How To Spot It

Review the privacy and sharing settings of any new platform immediately after signup, before using the product. Assume defaults are permissive until verified otherwise. Check location permissions, ad tracking settings, profile visibility, data sharing with third parties, and telemetry options. If navigating to full privacy requires multiple submenus, you are dealing with a design that treats privacy as an edge case.
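For a concrete version of that audit, here is a minimal TypeScript sketch that walks whatever settings a platform exposes and flags anything still in its most permissive state. The category names, the list of permissive values, and the example post-signup values are all assumptions made up for this illustration, not a real platform's API.

```typescript
// Hypothetical post-signup audit: flag every setting found in its most
// permissive state before the product has been used at all.
type Finding = { setting: string; value: string; permissive: boolean };

const permissiveValues = new Set(["public", "always", "full", "on"]);

function auditDefaults(currentSettings: Record<string, string>): Finding[] {
  return Object.entries(currentSettings).map(([setting, value]) => ({
    setting,
    value,
    permissive: permissiveValues.has(value.toLowerCase()),
  }));
}

// Example values as found immediately after signup, before changing anything.
const findings = auditDefaults({
  profileVisibility: "public",
  locationPermission: "always",
  adTracking: "on",
  thirdPartySharing: "on",
  telemetry: "full",
});

for (const f of findings.filter((f) => f.permissive)) {
  console.warn(`Permissive default: ${f.setting} = ${f.value}`);
}
```

If most of the flags fire on a fresh account, you are looking at the pattern described in this entry.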

Documented Incidents

#01

Facebook: profiles defaulted to public for years; photo tags, friend lists, and relationship status shared broadly by default

#02

Google: location history enabled by default across products; ad personalization on by default; data sharing across Google services default-on

#03

LinkedIn: "open to work" and visibility settings defaulting to maximum exposure; profile appearing in partner sites by default

#04

Smart TVs: ACR (automatic content recognition) tracking enabled by default, requiring settings navigation to disable

#05

Mobile apps: "allow always" location prompts designed to obscure the "only while using" option; permissions defaulting to broad access

Body Count

The personal data of billions of users has been collected, profiled, and monetized under default settings those users never meaningfully chose. The downstream effects — targeted manipulation, insurance discrimination, identity theft, and the general erosion of informational self-determination — cannot be expressed in a single statistic. The Bureau notes that this is precisely the point.

Legal Status

GDPR requires that data protection be enabled by default ("data protection by default" under Article 25), specifically prohibiting the Privacy Zuckering model for EU users. The FTC has brought multiple actions against companies for deceptive privacy defaults. The California Privacy Rights Act contains default-privacy provisions. Enforcement has produced significant fines but has not structurally changed industry default practices.

Bureau Verdict

"Privacy Zuckering is classified as a war crime because it operates at population scale, has been deployed deliberately and knowingly, and has produced measurable harm to billions of people. The Bureau is not using "war crime" loosely here. The pattern represents a systematic, intentional choice to configure digital infrastructure in a way that extracts maximum value from users while providing minimum transparency. The fact that this is legal in most of the world is a regulatory failure, not an ethical endorsement."

— Bureau of Non-Consensual Cookie Bandits

Companies Caught Using This Pattern

Full audits available in the Privacy Policy Hall of Shame.