May
Workshop and Presentation on Sensitivity Analysis

Lund Social Science Methods Centre, together with Moira Nelson at the Department of Political Science, is organising a two-day workshop on sensitivity analysis: "What Would It Take to Change Your Inference? A Common Language for Engaging a Broad Set of Stakeholders in Discourse about Causal Inferences".
Below you will find all the information about the workshop!
Workshop, 20 May, 13:00–17:00
Statistical inferences are often challenged because of uncontrolled bias. There may be bias due to uncontrolled confounding variables or non-random selection into a sample. We will turn concerns about potential bias into questions about how much bias there must be to invalidate an inference. For example, challenges such as “But the inference of a treatment effect might not be valid because of pre-existing differences between the treatment groups” are transformed into questions such as “How much bias must there have been due to uncontrolled pre-existing differences to make the inference invalid?”
By reframing challenges about bias in terms of specific quantities, this workshop will contribute to scientific discourse about the uncertainty of causal inferences. Critically, while there are other approaches to quantifying the sensitivity of inferences (e.g., Robins and Rotnitzky, 1995; Rosenbaum and Rubin, 1983; Rosenbaum, 2000), the approaches presented in this workshop, based on correlations of omitted variables (Frank, 2000) and the replacement of cases (Frank and Min, 2007; Frank et al., 2013), have greater intuitive appeal. In this sense the techniques inform a conversation among a broad set of stakeholders and help apply research evidence to practice.
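To make this reframing concrete, the core case-replacement quantity (following Frank et al., 2013) can be sketched as follows; the numbers in the illustration are invented for this page and are not taken from the workshop materials.

% Case-replacement robustness (Frank et al., 2013): the share of an estimate
% that must be due to bias for the inference to be invalidated.
\[
  \text{\% bias to invalidate} \;=\; 1 - \frac{\delta^{\#}}{\hat{\delta}}
\]
% \hat{\delta} is the estimated effect; \delta^{#} is the threshold for inference
% (e.g., the smallest estimate that would remain statistically significant).
% Illustration with made-up numbers: if \hat{\delta} = 0.40 and \delta^{#} = 0.30,
% then 1 - 0.30/0.40 = 0.25, i.e., roughly 25% of the observed cases would have
% to be replaced with cases in which the treatment has no effect in order to
% invalidate the inference.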
Activities
This is a 4-hour workshop (including multiple breaks) with a mixture of presentation, individual hands-on exploration, group work and informal interactions. The interactive activities will be face-to-face, although the presented material can be accessed via Zoom, which can be recorded for asynchronous viewing. The target audience is anyone familiar with general quantitative methods such as regression and analysis of variance, as well as more sophisticated techniques (e.g., multilevel models and logistic regression).
In part I, we use Rubin’s causal model to interpret how much bias there must be to invalidate an inference in terms of replacing observed cases with counterfactual cases or cases from an unsampled population (e.g., Frank et al., 2013). This includes application to logistic regression (Frank et al., 2021). In part II, we quantify the robustness of causal inferences in terms of correlations associated with unobserved variables or in unsampled populations (e.g., Frank, 2000). The techniques can be applied to linear models and related extensions (e.g., multilevel models, mediation, propensity models). Calculations will be presented using the app https://konfound-project.shinyapps.io/konfound-it/ with links to Stata and R modules.
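For participants who would like to try the calculations ahead of time, here is a minimal sketch using the konfound R package that accompanies the app (Xu et al., 2019, describe the Stata counterpart). The inputs are invented for illustration, and the argument names (est_eff, std_err, n_obs, n_covariates, index) reflect our reading of the package interface rather than the workshop materials, so please check the documentation of the version you install.

# Minimal sketch with illustrative inputs: sensitivity analysis for a reported
# regression coefficient using the konfound package (https://konfound-it.com).
# install.packages("konfound")  # uncomment if the package is not yet installed
library(konfound)

# Part I framing: how many observed cases would have to be replaced with
# zero-effect cases to invalidate the inference (case replacement / RIR).
pkonfound(est_eff = 2, std_err = 0.4, n_obs = 100, n_covariates = 3)

# Part II framing: how strongly an omitted variable would have to correlate with
# the predictor and the outcome to invalidate the inference (impact threshold).
pkonfound(est_eff = 2, std_err = 0.4, n_obs = 100, n_covariates = 3, index = "IT")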
Presentation, 21 May, 10:00–11:00
Abstract
Beginning with debates about the effects of smoking on lung cancer, sensitivity analyses characterizing the hypothetical unobserved conditions that can alter statistical inferences have had profound impacts on public policy. One of the most ascendant techniques for sensitivity analysis is Oster’s coefficient of proportionality, which approximates how strong selection into a treatment on unobserved variables must be compared to selection on observed variables to change an inference. We refine Oster’s asymptotic approximation by deriving expressions for the correlations associated with an omitted variable that reduce an estimated effect to a specified threshold, given a corresponding coefficient of determination (R²). We verify our expressions through empirical examples and simulated data. Because our calculations are exact, they apply regardless of sample size. In contrast, Oster’s approximation is likely to overstate robustness when sample size is small and observed covariates account for a large portion of an estimated effect relative to a baseline model.
Moreover, unlike Oster’s estimator, our correlation-based expressions do not depend on the analyst’s choice of baseline model. They can also be calculated directly from conventionally reported quantities through commands in R or Stata and an online app, and can therefore be applied to most published studies.
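As background only, and not as a statement of the new results in the talk, the correlation-based framing from Frank (2000) on which this refinement builds can be sketched roughly as follows; the exact expressions derived in the presentation sharpen this logic.

% Correlation-based framing (Frank, 2000), sketched here as background.
% The impact of an omitted confounding variable cv on an estimated x-y relation
% is the product of its correlations with the predictor and the outcome.
\[
  \text{impact} \;=\; r_{x \cdot cv}\, r_{y \cdot cv},
  \qquad
  \text{ITCV} \;=\; \frac{r_{xy} - r^{\#}}{1 - \lvert r^{\#} \rvert}
\]
% r^{#} is the smallest x-y correlation that would still sustain the inference
% (determined by the critical t value and the degrees of freedom); the ITCV is
% the impact an omitted variable must have to drive the estimate below that
% threshold.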
Instructor bios
Kenneth Frank received his Ph.D. in measurement, evaluation and statistical analysis from the School of Education at the University of Chicago in 1993. He is MSU Foundation Professor of Sociometrics, professor in Counseling, Educational Psychology, and Special Education, and adjunct professor (by courtesy) in Fisheries and Wildlife and in Sociology at Michigan State University. Dr. Frank has published articles across the social sciences on quantifying the robustness of causal inferences, including in Sociological Methodology; Sociological Methods and Research; Journal of Educational and Behavioral Statistics; Educational Evaluation and Policy Analysis; and Journal of Clinical Epidemiology. This work has been cited in top journals in sociology (e.g., American Sociological Review; Social Forces; Sociology of Education; Journal of Marriage and Family) as well as in economics, psychology, political science, public policy, and health. See published examples under “more” at http://konfound-it.com. He has also developed the konfound-it.com app and accompanying R and Stata macros, and he gave the Duncan Lecture on sensitivity analysis at the ASA in 2020.
Qinyun Lin is a Senior Lecturer in the School of Public Health and Community Medicine, University of Gothenburg. Most broadly, she is a methodologist invested in humanizing quantitative research to better understand the mechanisms underlying inequities, particularly health and educational disparities, and, ultimately, to inform policymaking. Towards this, she seeks to advance and expand existing quantitative methods and the interpretation of inferences by incorporating human interactions, social and spatial contexts, and mediating factors within these analyses. Her work has appeared in high-impact peer-reviewed journals such as JAMA Network Open, Social Science & Medicine, and Psychological Methods.
Background readings
*Frank, K. A., *Lin, Q., *Maroulis, S., *Mueller, A. S., Xu, R., Rosenberg, J. M., ... & Zhang, L. (2021). Hypothetical case replacement can be used to quantify the robustness of trial results. Journal of Clinical Epidemiology, 134, 150-159. *authors listed alphabetically.
Xu, R., Frank, K. A., Maroulis, S. J., & Rosenberg, J. M. (2019). konfound: Command to quantify robustness of causal inferences. The Stata Journal, 19(3), 523-550.
Register for the workshop
The last day to register for this activity has passed.
Practicalities
The workshop on 20 May runs from 13:00 to 17:00, including a coffee break.
The presentation on 21 May runs from 10:00 to 11:00. Coffee and rolls will be served from 09:30.
Organisers:
Moira Nelson, Associate Professor and Senior Lecturer at the Department of Political Science, Lund University
Qinyun Lin, Senior Lecturer at the School of Public Health and Community Medicine, University of Gothenburg
Kenneth Frank, MSU Foundation Professor of Sociometrics, Michigan State University
We look forward to seeing you at the workshop in May!
Arranged by: Lund Social Science Methods Centre
About the event
Location:
Sh208, Gamla Köket (School of Social Work), Allhelgona kyrkogata 8, Lund
Target group:
Researchers in the social sciences, and potentially researchers in other fields as well.
Language:
In English
Contact:
moira [dot] nelson [at] svet [dot] lu [dot] se