Privacy Anxiety and Platform Migration on Social Media

How platform governance changes activate and amplify privacy anxiety across online communities

Social Media, AI Ethics, Platform Governance, Privacy Anxiety, Mixed Methods, Policy Analysis, Topic Modeling, Policy

Project Snapshot

  • Timeframe: March 2025 – January 2026
  • Category: Research, AI Ethics, Social Media
  • Role: Researcher & System Designer

Project Description

  • This thesis examines how a Terms of Service update on X that enabled AI training on user content by default activated privacy anxiety and reshaped user behavior. Privacy anxiety is framed as a structural outcome of uncertainty, power asymmetries, and loss of control over data use, especially for content creators. The study shows that anxiety first emerged within creator communities and then diffused across user groups through inter-community interaction, forming a platform-wide affective atmosphere. Algorithmic systems did not initiate this process but selectively intensified it by increasing the visibility of anxiety-laden, privacy-related content. As anxiety escalated, community cohesion weakened, engagement declined, and large-scale digital migration followed, driven largely by creators. Overall, the findings highlight how AI-driven platform governance can activate and scale privacy anxiety, leading to a breakdown of trust and to user migration.
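
As a rough illustration of the topic-modeling step listed in the tags above, the minimal sketch below fits a scikit-learn LDA model over a handful of placeholder posts. The corpus, the choice of scikit-learn, and the topic count are assumptions made for illustration only and do not represent the thesis's actual data or pipeline.

```python
# Hypothetical sketch: surface candidate themes (e.g., consent, control, migration)
# in privacy-related posts using LDA topic modeling with scikit-learn.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Placeholder posts standing in for collected data; the real corpus is not shown here.
posts = [
    "The new terms let them train AI on my art without asking",
    "Opting out of AI training is buried deep in the settings",
    "Thinking about moving my account to another platform",
    "Creators deserve control over how their content is used",
    "Nobody explained what happens to my data under the new policy",
    "Half my mutuals have already migrated elsewhere",
]

# Build a bag-of-words representation of the posts.
vectorizer = CountVectorizer(stop_words="english", max_features=5000)
doc_term = vectorizer.fit_transform(posts)

# Fit a small LDA model; the number of topics is an illustrative assumption.
lda = LatentDirichletAllocation(n_components=3, random_state=0)
lda.fit(doc_term)

# Print the top terms per topic to inspect candidate themes.
terms = vectorizer.get_feature_names_out()
for idx, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"Topic {idx}: {', '.join(top)}")
```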