Justus Randolph has a PhD in education research and program evaluation, an MEd in international education, and a certification in educational administration. Currently, he is an Assistant Professor of Education at Mercer University. In the past, he has worked as a program evaluator or researcher for organizations such as the Center for Policy and Program Evaluation, The Worldwide Institute for Research and Evaluation, the National Center for Hearing Assessment and Management, Utah State University, the University of Joensuu, HAMK University of Applied Sciences, and the Logan City School District. His research and evaluation experience spans programs involving newborn hearing assessment, school improvement, higher education evaluation, technology-enriched playgrounds, educational technology research methods, and computing education. He has developed and taught courses in quantitative and qualitative research methods, evaluation, and scholarly writing. He is the author of the book Multidisciplinary Methods in Educational Technology Research and Development and dozens of scholarly articles.
Randolph, J. J., Kangas, M., & Ruokamo, H. (2010). Predictors of Dutch and Finnish students' satisfaction with schooling. Journal of Happiness Studies, 11(2), 193-204.
Considerable research has shown that there are clear links among satisfaction with schooling, overall life satisfaction, and physical and psychological well-being. In this investigation, we expand on that line of research by identifying the predictors of overall satisfaction with schooling of 331 Dutch and Finnish pupils aged 6–13. Similar to previous research, student age, student gender, and teacher likeability were strong predictors of students’ overall satisfaction with schooling. New findings from this investigation were that teacher gender and class size were significant predictors of overall satisfaction with schooling.
Randolph, J. J. (2007). Multidisciplinary methods in educational technology research and development. Hämeenlinna, Finland: HAMK Press.
Over the past thirty years, there has been much dialogue and debate about the conduct of educational technology research and development. In this brief volume, Justus Randolph helps clarify that dialogue by theoretically and empirically charting the research methods used in the field and by providing much practical information on how to conduct educational technology research.
Within this text, readers can expect to find answers to the following questions:
- What are the methodological factors that need to be taken into consideration when designing and conducting educational technology research?
- What types of research questions do educational technology researchers tend to ask?
- How do educational technology researchers tend to conduct research? What approaches do they use? What variables do they examine? What types of measures do they use? How do they report their research?
- How can the state of educational technology research be improved?
Randolph, J. J. (2007). Meta-analysis of the effects of response cards on student achievement, participation, and intervals of off-task behavior. Journal of Positive Behavior Interventions, 9(2), 113-128.
In this meta-analysis, 18 response card articles, theses, or dissertations were analyzed to determine the magnitude of effect that response card strategies have on test achievement, quiz achievement, participation, and intervals of off-task behavior. The 18 studies were also analyzed to determine whether the type of response cards used or the presence or absence of ceiling effects had a differential effect on study outcomes. Using the traditional method of hand-raising as a control condition, it was found that response cards have large, statistically significant effect sizes for test achievement, quiz achievement, participation, and reduction in intervals of disruptive behavior. No significant difference was found between types of response cards used. Although the difference was not statistically significant, studies with ceiling effects had, on average, effect sizes that were notably lower than those of studies without ceiling effects. Neither place of publication, type of publication, nor sample size was found to be a significant moderator of effect sizes for academic achievement.
Randolph, J. J., & Eronen, P. J. (2007). Developing the Learning Door: A case study in youth participatory program planning. Evaluation and Program Planning, 30(1), 55-65.
This article presents the results of a case study in youth participatory program planning conducted in the context of a nonformal technology-education program in eastern Finland. The purpose of the program was to have youth, university, and business stakeholders work together to create the Learning Door, a door that would meet the needs of older people and people with disabilities. The participatory program planning process that was used involved clarifying the mission, roles, and modes of collaboration, as well as creating stakeholder matrices, logic models, program plans, and implementation plans. It was found that the observed program planning process was similar to the intended planning process and that the process was well received by the planning participants. The lessons learned include clarifying the nature of collaboration before the program gets underway, reviewing program planning steps often, and making clear distinctions between logic models and implementation plans.