We previously reported positive outcomes in mental health knowledge, help-seeking intention, and attitudes among participants in the Peer-to-Peer (P2P) Depression Awareness Program, a school-based mental health literacy program (1) run by the University of Michigan Depression Center in partnership with public school districts. Each school in the program recruits student “peer advocates,” who design and implement schoolwide mental health promotional campaigns. School campaigns vary both in the number of activities initiated and repeated and in their level of interactivity (activities involving two-way action between the learner and the instructor of an educational event) (2). Understanding how such interventions educate is vital to optimizing their impact. We investigated whether outcomes are linked to repetition/reinforcement as well as to interactivity; both are fundamental mechanisms of student knowledge acquisition (3). We used dose-response evaluation (4) to examine whether outcomes were linked to student exposure to particular quantities (dose) and types of campaign activities.
Data for the present analyses were drawn from eight participating schools in the Washtenaw and Oakland County school districts in the 2017–2018 academic year (a ninth participating school was dropped because it used a different student-body selection process) to evaluate the effect of the P2P program on depression knowledge, mental health stigma, and help-seeking intention. Depression knowledge was measured by summing correct responses to depression symptom identification and true-false questions about depression; the maximum score is 12. Stigma was measured with a test derived from the Revised Attribution Questionnaire (r-AQ), where a higher score indicates greater stigma; the maximum score is 57. Help-seeking intention was measured by summing students’ ratings of how likely they would be to approach certain members of the community for help (1, not likely; 4, very likely); the maximum score is 56. There were 616 observations used for the depression knowledge model, 578 for the stigma model, and 579 for the help-seeking model. Two researchers independently evaluated whether a particular campaign activity was interactive according to the definition noted above. Effect size is defined as the difference between post- and pretest (baseline) scores. An ideal effect size in this case would be negative for stigma (a decrease) and positive for depression knowledge and help-seeking intention (an increase).
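As a minimal sketch of how the outcome variables were constructed, the snippet below computes per-student effect sizes as posttest minus pretest scores for the three domains. The column names are hypothetical, since the report does not specify the dataset layout; the scoring rules follow the descriptions above.

```python
import pandas as pd

def effect_sizes(df: pd.DataFrame) -> pd.DataFrame:
    """Effect size per domain = posttest score minus pretest (baseline) score."""
    out = pd.DataFrame(index=df.index)
    # Depression knowledge: sum of correct responses (maximum 12); positive change is desirable.
    out["knowledge_effect"] = df["knowledge_post"] - df["knowledge_pre"]
    # Stigma (r-AQ-derived): higher score = greater stigma (maximum 57); negative change is desirable.
    out["stigma_effect"] = df["stigma_post"] - df["stigma_pre"]
    # Help-seeking intention: sum of 1-4 likelihood ratings (maximum 56); positive change is desirable.
    out["help_seeking_effect"] = df["help_seeking_post"] - df["help_seeking_pre"]
    return out
```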
For each domain, we fitted a linear model with effect size as the response and the total number of activities overall plus the total number of interactive activities in the school’s campaign (referred to as activity-interactivity henceforth) as covariates, adjusting for grade, campaign visibility, and pretest score (5). Results were mixed but suggest an association between activity-interactivity and effect size. We found that adding an interactive activity to the P2P awareness campaign significantly reduced students’ stigma score by 0.47 point (95% confidence interval [CI]=–0.88, –0.07; p<.05). In addition, compared with a school campaign of two activities, a school campaign with five activities significantly improved the depression knowledge score by 2.12 points (95% CI=1.34, 2.90; p<.05) and decreased stigma scores by 0.63 point (95% CI=–1.36, 2.62; p<.05) but also significantly decreased help-seeking scores by 2.78 points (95% CI=0.29, 5.26; p<.05). We note that the adjusted R² for model fit was still relatively low (0.31, 0.17, and 0.19 for the depression knowledge, stigma, and help-seeking models, respectively), which might improve with the inclusion of other covariates (e.g., socioeconomic status) in future research.
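The sketch below illustrates the structure of the models just described: effect size regressed on the total number of campaign activities and the number of interactive activities, adjusting for grade, campaign visibility, and pretest score. Variable names are hypothetical, and this is only an illustration of the model specification, not the authors’ analysis code.

```python
import statsmodels.formula.api as smf

def fit_dose_response(df, effect_col):
    """Fit one dose-response linear model for a given outcome domain.

    Covariates: total activities, interactive activities (activity-interactivity),
    plus grade, campaign visibility, and pretest score as adjustments.
    """
    formula = (
        f"{effect_col} ~ n_activities + n_interactive "
        "+ C(grade) + visibility + pretest_score"
    )
    return smf.ols(formula, data=df).fit()

# Example usage, one model per domain:
# knowledge_model = fit_dose_response(analysis_df, "knowledge_effect")
# print(knowledge_model.summary())  # coefficients, 95% CIs, adjusted R-squared
```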
Although results were mixed and we would have liked to see help-seeking scores increase rather than decrease, the significance of the number of activities and the level of interactivity in the schools’ campaigns should not be overlooked. Our findings suggest that school-based mental health intervention programs should incorporate both repetition and interactivity. Worth noting is that our data were unevenly distributed across schools: five schools ran campaigns with four activities, whereas only one school each ran campaigns with two, three, and five activities. Future research should further examine and validate the dose-response relationship we have identified in order to refine and enhance mental health literacy programs.