
Does APSI Training Make A Difference? Some Evidence That It Does! – Pt 2

Over the last three years, APSI has been engaged in extensive work to revise and update UPC into our new series, Foundations of Prevention Science and Practice, while also addressing the need for virtual trainings delivered with the most effective strategies. Several considerations and accommodations had to be made in our training approach:

· What is the ideal length of an online training session? How can we hold participants' attention?

· What distractions might participants face at home: children, pets, family members passing by, and their own work? During COVID, the number of e-mails received doubled or even tripled!

· Another consideration was the learning process itself. In UPC we had created a coherent and connected learning process. Face-to-face training allows for countless dynamics and strategies to help the learner learn. Virtually, we needed to be very creative and very mindful of the questions and exercises that would help everyone in the group – the visual learner, the kinesthetic learner, and the auditory learner.

· How could we make sure our learners retain the knowledge we provide, and keep it for use in practice later on?

· And finally: participation, engagement, and intention to apply what they learn to their work. Keeping participants engaged and interacting throughout the sessions is important.

To address attention span and avoid “death by webinar and PowerPoint,” we split the whole training into three-hour virtual sessions, each with one 15-minute break. We included breakout sessions and other activities that focused on participants’ own work and experience and helped them incorporate new knowledge into their practice. These also renewed energy levels and bonds among participants. When it came to distractions, we made sure to acknowledge them and to be flexible when participants had something urgent to attend to. We always begin our trainings by stating the ground rules and that we are counting on everyone to be present, and we strongly encourage participants to keep their cameras on at all times.

In terms of the learning process, we reviewed all the interactive sessions from our face-to-face training and reframed them using the tools we had available, while maintaining the objective of each exercise: What are participants supposed to learn? How can they connect? What is the best way to share? How big (or small) can the groups be? What tools do we need to enhance participation as much as possible?

For knowledge retention, we decided to incorporate short quizzes as we moved along in the training. We used Kahoot! in all modules, at different points and with key content, so we could see where participants stood in understanding and retaining the information. Each Kahoot! question addresses knowledge content and is designed to refresh and enhance learning.

Finally, one of the most important components, and one we always stress, is that participants complete an evaluation at the end of each module. Based on these evaluations, we have improved the training process: changing the style, doing more or less of a specific strategy or exercise, allowing more or less time for group work, and so on. In essence, we have tailored the training to the needs of the participants to give them the best possible experience and, of course, achieve our learning objectives. We can say that nobody experiences the same APSI training twice: we are always tweaking the delivery process and updating the research to provide our trainees the best experience, whether virtual or in-person.

Post-COVID, we have maintained many of these enhancements in our revised, updated curriculum, renamed Foundations of Prevention Science and Practice. Foundations still includes an introductory course and courses focusing on interventions delivered to the family, in schools, in workplaces, in the environment, and through the media, all with an emphasis on building community infrastructures to support a comprehensive and integrated system of prevention programs that meets the needs of communities and is supported by strong monitoring and evaluation components.

Below are examples of the pre-/post-test scores from the cohorts of U.S. prevention practitioners who have participated in the Foundations introductory courses. The range of scores is important to observe: a narrower range indicates more shared knowledge. Please note that we limit each training to 30 participants to optimize engagement and learning.

In addition to the pre-/post-tests of knowledge, we also have the training evaluations, which have been very positive. The module evaluation uses a Likert scale (Strongly Agree, Agree, Neutral, Disagree, Strongly Disagree) for responses to a series of statements about the module training. Among the statements included in the evaluation are:

· The training topics were relevant to my work.

· I expect to use the information gained from this training.

· I would recommend this training to a colleague.

Overall, between 70% and 100% of trainees marked Agree or Strongly Agree on the three statements, which is an encouraging measure of meeting our training objectives.
