Counterfactual Experiment Design

Causal inference frameworks for auditing algorithmic systems

This project develops methodological frameworks for causally estimating the effects of algorithmic systems using counterfactual experimental designs. We pioneered the use of “counterfactual bots”: automated agents that replay real user behavior and then diverge into controlled algorithmic exposure conditions.

Our counterfactual bot framework enables causal inference in platform studies by creating digital twins that experience different algorithmic treatments while holding user behavior constant.

Key methodological contributions include:

  • Counterfactual bot design for platform auditing and causal inference
  • Digital twin methodologies that separate user agency from algorithmic influence
  • Cross-platform experimental frameworks for comparative algorithmic studies
  • Causal inference tools adapted for sociotechnical systems research
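The core contrast behind the counterfactual bot design can be sketched in a few lines. Everything below is a toy simulation: the recommender, the topic labels, and the outcome measure are invented for illustration and are not the platform APIs or code used in the published study. Two arms share an identical exposure history, then one arm keeps making user-driven choices while its counterfactual twin always follows the recommender; the difference in outcomes is the causal contrast of interest.

```python
import random

# Illustrative stand-in for a platform recommender: given the last item
# consumed, return a ranked list that leans toward the same topic.
# This is an assumption for the sketch, not the study's implementation.
def recommend(last_item, rng):
    same_topic = last_item[0]
    pool = [same_topic + str(i) for i in range(5)]
    pool += [t + str(i) for t in "ABC" for i in range(2)]
    rng.shuffle(pool)
    return pool

def run_arm(history, n_steps, follow_recs, seed):
    """Replay a shared seed history, then branch: the counterfactual bot
    always takes the top recommendation, while the 'user' arm keeps
    choosing items matching the user's own preferred topic 'A'."""
    rng = random.Random(seed)
    trace = list(history)
    for _ in range(n_steps):
        recs = recommend(trace[-1], rng)  # both arms query the recommender
        if follow_recs:
            trace.append(recs[0])                      # algorithm-driven
        else:
            trace.append("A" + str(rng.randrange(5)))  # user-driven
    return trace

def outcome(trace):
    """Outcome of interest here: share of consumed items in topic 'A'."""
    return sum(item.startswith("A") for item in trace) / len(trace)

history = ["A0", "A1", "B0"]  # identical exposure history for both arms
user_share = outcome(run_arm(history, 50, follow_recs=False, seed=0))
bot_share = outcome(run_arm(history, 50, follow_recs=True, seed=0))
effect = user_share - bot_share  # causal contrast between the two arms
print(f"user arm: {user_share:.2f}  bot arm: {bot_share:.2f}  effect: {effect:+.2f}")
```

Because the two arms are identical up to the branch point, any difference in the outcome is attributable to who made the choices after that point, the user or the algorithm, which is the separation of user agency from algorithmic influence described above.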

This work, published in PNAS, established new standards for rigorous causal evaluation of algorithmic systems and demonstrated that user intent often outweighs algorithmic bias in content exposure patterns. The methodology is now being applied across domains including youth mental health, misinformation, and content recommendation systems.

Ongoing Projects

Investigating the Influence of Social Media Content and Algorithms on Adolescent Mood

This ongoing research applies our counterfactual bot methodology to study how social media platforms affect adolescent mental health and mood. By creating digital twins of adolescent users, we aim to causally estimate the effects of different types of content and algorithmic recommendation strategies on emotional well-being, separating the influence of user behavior from platform design choices.
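A minimal sketch of how the resulting causal contrast might be summarized, assuming paired exposure arms and self-reported mood as the outcome. All data here are synthetic and fabricated purely for illustration; a nonparametric bootstrap gives an interval estimate for the mean difference between arms.

```python
import random
import statistics

rng = random.Random(42)

# Synthetic, illustrative data only: mood scores (roughly a 1-10 scale)
# under two hypothetical exposure conditions a paired bot design might
# compare. These numbers are invented, not study results.
algorithmic_feed = [rng.gauss(5.8, 1.5) for _ in range(200)]
chronological_feed = [rng.gauss(6.2, 1.5) for _ in range(200)]

def mean_diff(a, b):
    return statistics.fmean(a) - statistics.fmean(b)

observed = mean_diff(algorithmic_feed, chronological_feed)

# Nonparametric bootstrap: resample each arm with replacement to get a
# 95% percentile interval on the mean mood difference.
boot = []
for _ in range(2000):
    a = rng.choices(algorithmic_feed, k=len(algorithmic_feed))
    b = rng.choices(chronological_feed, k=len(chronological_feed))
    boot.append(mean_diff(a, b))
boot.sort()
lo, hi = boot[int(0.025 * len(boot))], boot[int(0.975 * len(boot)) - 1]
print(f"mean mood difference: {observed:+.2f}  (95% CI {lo:+.2f} to {hi:+.2f})")
```

An interval that excludes zero would suggest the exposure condition, rather than chance variation, is driving the mood difference; the paired bot design is what licenses reading that difference causally.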

References

2024

  1. Causally estimating the effect of YouTube’s recommender system using counterfactual bots
    Homa Hosseinmardi, Amir Ghasemian, Miguel Rivera-Lanas, Manoel Horta Ribeiro, Robert West, and Duncan J. Watts
    Proceedings of the National Academy of Sciences, 2024