Measuring the Effectiveness of In-School CVE Intervention Programs: Scope and Evaluation Methods †

1 Community Safety Branch, Emergency Preparedness Research Evaluation & Practice (EPREP) Program, Harvard T.H. Chan School of Public Health, Boston, MA 02115, USA
2 Center for Terrorism and Security Studies, University of Massachusetts Lowell, Lowell, MA 01854, USA
3 Operation 250, Lowell, MA 01854, USA
* Author to whom correspondence should be addressed.
† Presented at the Global Safety Evaluation Workshop, Online, 1 July–31 December 2020.
Published: 27 April 2021
(This article belongs to the Proceedings of Global Safety Evaluation (GSE) Network Workshop)

Abstract

This presentation outlines the results of the primary programmatic evaluation efforts the Emergency Preparedness Research Evaluation and Practice (EPREP) Program has conducted since 2016. The presentation begins with an overview of the methodology for selecting outcome measures to evaluate program efficacy, as well as a description of the evaluation framework. Results of the longitudinal, quasi-experimental 2017 evaluation of the Online4Good Academy—one of the training events at the focus of the Boston CVE Pilot Program—are presented and discussed. In 2018, the EPREP Program used a longitudinal, quasi-experimental design to evaluate the efficacy of the Peer2Peer anti-hatred campaign Kombat with Kindness; results and implications from that study are discussed. The final portion of the presentation describes the more recent activities of the EPREP Program: an evaluation of the online safety program Operation 250 (OP250). This portion describes the psychological framework and theory of change under which OP250 implements its initiative. In this final segment, we also present the preliminary results of a randomized controlled trial, conducted at two study sites in Massachusetts, designed to evaluate the program's efficacy.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by the Institutional Review Board of the Harvard T.H. Chan School of Public Health (IRB16-1757, approved on 24 January 2017).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Due to research subjects’ privacy considerations, the data gathered during the course of the project are available upon request via the National Archive of Criminal Justice Data (NACJD), hosted by the Inter-university Consortium for Political and Social Research (ICPSR) at the University of Michigan, at the following link: https://www.icpsr.umich.edu/web/NACJD/studies/37338.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Su, M.; Harriman, N.; Shortland, N.; Cote, T.; Savoia, E. Measuring the Effectiveness of In-School CVE Intervention Programs: Scope and Evaluation Methods. Proceedings 2021, 77, 14. https://doi.org/10.3390/proceedings2021077014

AMA Style

Su M, Harriman N, Shortland N, Cote T, Savoia E. Measuring the Effectiveness of In-School CVE Intervention Programs: Scope and Evaluation Methods. Proceedings. 2021; 77(1):14. https://doi.org/10.3390/proceedings2021077014

Chicago/Turabian Style

Su, Max, Nigel Harriman, Neil Shortland, Tyler Cote, and Elena Savoia. 2021. "Measuring the Effectiveness of In-School CVE Intervention Programs: Scope and Evaluation Methods." Proceedings 77, no. 1: 14. https://doi.org/10.3390/proceedings2021077014
