About - CCRAM

Meet our experts

At CCRAM, you will learn directly from experts in behavioural science data analysis, design, and methods.

The Canadian Centre for Research Analysis and Methods (CCRAM) is the preeminent destination for academics, and for researchers in government and industry anywhere in the world, to learn from Canada's leading behavioural science methodologists. Among its many services, CCRAM offers online and in-person seminars led by Canada-affiliated researchers and educators whose work focuses on the development and dissemination of methodological knowledge and tools useful to the behavioural scientist. CCRAM seminars cater to scientists across disciplines – in the social sciences, health, business, and elsewhere in academia, government, and industry – who want to learn directly from methodology experts in data analysis, design, and methods.



Dr. Andrew F. Hayes, PhD

University of Calgary

Courses taught for CCRAM:

  • Introduction to Mediation, Moderation, and Conditional Process Analysis
  • Mediation, Moderation, and Conditional Process Analysis: A Second Course
  • Mediation, Moderation, and Conditional Process Analysis: The Complete Course


Dr. Andrew Hayes is a Distinguished Research Professor at the Haskayne School of Business at the University of Calgary (with an adjunct appointment in the Department of Psychology) and the founder and academic director of the Canadian Centre for Research Analysis and Methods.

Hayes invented the PROCESS macro for SPSS, SAS, and R, widely used by researchers examining the mechanisms and contingencies of effects. He has written extensively on research methodology, including the books Introduction to Mediation, Moderation, and Conditional Process Analysis (2022), Regression Analysis and Linear Models (2017), and Statistical Methods for Communication Science (2005). His work in books and journal articles, in methodology as well as various substantive areas, has been cited well over 200,000 times according to Google Scholar, and he was designated a Highly Cited Researcher by Clarivate Analytics in 2019, 2020, 2021, 2022, and 2023.
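Conceptually, the kind of mediation analysis PROCESS automates reduces to a pair of regressions plus a bootstrap. The sketch below is not PROCESS itself: it is a minimal Python illustration, on simulated data, of the percentile-bootstrap test of an indirect effect described in Preacher and Hayes (2008); all variable names and effect sizes are invented for the example.

```python
import numpy as np

def indirect_effect(x, m, y):
    """a*b indirect effect of x on y through m, from two OLS fits."""
    a = np.polyfit(x, m, 1)[0]                   # slope of m on x (the a path)
    X = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(X, y, rcond=None)[0][2]  # slope of y on m, controlling x (the b path)
    return a * b

# Simulated data with a known mediation structure (hypothetical effect sizes)
rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)                 # true a = 0.5
y = 0.4 * m + 0.2 * x + rng.normal(size=n)       # true b = 0.4, so true indirect = 0.20

point = indirect_effect(x, m, y)

# Percentile bootstrap: resample cases, re-estimate a*b, take the 2.5th/97.5th percentiles
boot = []
for _ in range(2000):
    i = rng.integers(0, n, n)
    boot.append(indirect_effect(x[i], m[i], y[i]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {point:.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
```

A confidence interval excluding zero is the usual evidence of mediation; the bootstrap avoids assuming the product a*b is normally distributed.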

He teaches courses on applied data analysis and conducts online and in-person workshops on statistical analysis for multidisciplinary audiences throughout the world – most frequently for faculty and graduate students in business schools, but also for researchers in education, psychology, social work, communication, public health, and government.


  • Hayes, A. F. (2022). Introduction to mediation, moderation, and conditional process analysis: A regression-based approach (3rd edition). New York: The Guilford Press.
  • Hayes, A. F. (2017). Regression analysis and linear models: Concepts, applications, and implementation. New York: The Guilford Press.
  • Hayes, A. F. (2005). Statistical methods for communication science. Routledge/Taylor and Francis.
  • Coutts, J. J., & Hayes, A. F. (in press). Questions of value, questions of magnitude: An exploration and application of methods for comparing indirect effects in multiple mediator models. Behavior Research Methods.
  • Rockwood, N. J., & Hayes, A. F. (2022). Multilevel mediation analysis. In A. A. O'Connell, D. B. McCoach, & B. Bell (Eds.), Multilevel modeling methods with introductory and advanced applications. Information Age Publishing.
  • Igartua, J.-J., & Hayes, A. F. (2021). Mediation, moderation, and conditional process analysis: Concepts, computations, and some common confusions. Spanish Journal of Psychology, 24, e49.
  • Hayes, A. F., & Rockwood, N. J. (2020). Conditional process analysis: Concepts, computations, and advances in the modeling of the contingencies of mechanisms. American Behavioral Scientist, 64, 19-54.
  • Coutts, J. J., Hayes, A. F., & Jiang, T. (2019). Easy statistical mediation analysis with distinguishable dyadic data. Journal of Communication, 69, 612-649.
  • Hayes, A. F. (2018). Partial, conditional, and moderated moderated mediation: Quantification, inference, and interpretation. Communication Monographs, 85, 4-40.
  • Hayes, A. F., & Rockwood, N. J. (2017). Regression-based statistical mediation and moderation analysis in clinical research: Observations, recommendations, and implementation. Behaviour Research and Therapy, 98, 39-57.
  • Hayes, A. F., Montoya, A. K., & Rockwood, N. J. (2017). The analysis of mechanisms and their contingencies: PROCESS versus structural equation modeling. Australasian Marketing Journal, 25, 76-81.
  • Hayes, A. F., & Montoya, A. K. (2017). A tutorial on testing, visualizing, and probing interaction involving a multicategorical variable in linear regression analysis. Communication Methods and Measures, 11, 1-30.
  • Montoya, A. K., & Hayes, A. F. (2017). Two condition within-participant statistical mediation analysis: A path-analytic framework. Psychological Methods, 22, 6-27.
  • Hayes, A. F. (2015). An index and test of linear moderated mediation. Multivariate Behavioral Research, 50, 1-22.
  • Hayes, A. F. (2014). Statistical mediation analysis with a multicategorical independent variable. British Journal of Mathematical and Statistical Psychology, 67, 451-470.
  • Hayes, A. F., & Scharkow, M. (2013). The relative trustworthiness of inferential tests of the indirect effect in statistical mediation analysis: Does method really matter? Psychological Science, 24, 1918-1927.
  • Hayes, A. F., & Preacher, K. J. (2010). Quantifying and testing indirect effects in simple mediation models when the constituent paths are nonlinear. Multivariate Behavioral Research, 45, 627-660.
  • Hayes, A. F. (2009). Beyond Baron and Kenny: Statistical mediation analysis in the new millennium. Communication Monographs, 76, 408-420.
  • Preacher, K. J., & Hayes, A. F. (2008). Asymptotic and resampling strategies for assessing and comparing indirect effects in multiple mediator models. Behavior Research Methods, 40, 879-891.


Dr. Doug Baer, PhD

University of Victoria (Emeritus)

Courses taught for CCRAM:

Introduction to Structural Equation Modeling



Doug Baer is Professor Emeritus at the University of Victoria. Prior to his retirement, he was Academic Director of the University of Victoria branch of the Statistics Canada Research Data Centre, which he founded in 2006. He has served on the Canadian Research Data Centre Network (CRDCN) National Committee, the Academic Directors' Council, and the CRDCN Executive; as President of the Canadian Sociological Association; and as a member of grant panels for the Canada Foundation for Innovation and the Social Sciences and Humanities Research Council of Canada.

Since 1986, Doug has taught quantitative methods courses at his own university as well as for various organizations, including the Inter-university Consortium for Political and Social Research (ICPSR), the Global School in Empirical Research Methods, and the Canadian Demographic Association/Statistics Canada, among many others. His work has been published in various sociology journals, including the American Sociological Review and Social Forces.

Hear Dr. Baer talk about the pleasure he takes in teaching methods: [video]


Dr. Felix Cheung, PhD

University of Toronto

Courses taught for CCRAM:
Doing Open and Replicable Science


Dr. Felix Cheung is an Assistant Professor in the Department of Psychology at the University of Toronto. Cheung researches the determinants and consequences of subjective well-being across diverse populations. He also studies meta-science, the scientific study of science, examining how the reliability of scientific findings can be improved by 'Big Science' (studies conducted by large collaborative teams), open science practices (such as pre-registration and data sharing), and research incentives.

  • Landy, J. F., Jia, M., Ding, I. L., Viganola, D., Tierney, W., Dreber, A., … Cheung, F., …, & Uhlmann, E. L. (2020). Crowdsourcing hypothesis tests: Making transparent how design choices shape research results. Psychological Bulletin, 146(5), 451-479.
  • Silberzahn, R., Uhlmann, E.L., Martin, D.P., Anselmi, P., Aust, F., Awtrey, E., Cheung, F., … Nosek, B. A. (2018). Many analysts, one dataset: Making transparent how variations in analytical choices affect results. Advances in Methods and Practices in Psychological Science, 1, 337-356.
  • Anderson, C. J., Bahník, Š., Barnett-Cowan, M., Bosco, F. A., Chandler, J., Chartier, C. R., … Cheung, F., …, & Zuni, K. (2016). Response to comment on “Estimating the reproducibility of psychological science”. Science, 351(6277), 1037.
  • Schweinsberg, M., Madan, N., Vianello, M., Sommer, S. A., Jordan, J., Tierney, W., Awtrey, E., Zhu, L., … Cheung, F., … , & Uhlmann, E. L. (2016). The pipeline project: Pre-publication independent replications of a single laboratory's research pipeline. Journal of Experimental Social Psychology, 66, 55-67.
  • Tierney, W., Schweinsberg, M., Jordan, J., Kennedy, D. M., Qureshi, I., Sommer, S.A., … Cheung, F., …, & Uhlmann, E. L. (2016). Data from a pre-publication independent replication initiative: The pipeline project. Scientific Data, 3, 160082.
  • Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251).
  • Johnson, D.J., Cheung, F., & Donnellan, M.B. (2014). Hunting for artifacts: The perils of dismissing inconsistent replication results. Social Psychology, 45, 318-320.


Dr. Hadi Fariborzi, PhD

Mount Royal University

Courses taught for CCRAM:
Systematic Review and Meta-Analysis


Dr. Hadi Fariborzi is an Assistant Professor of Innovation and Entrepreneurship at Mount Royal University. His primary areas of research are international entrepreneurship, the growth of small and new ventures, and national cultures, with a particular focus on systematic reviews and meta-analysis. He co-founded the online meta-analytic platform HubMeta and has managed multiple teams of research assistants on meta-analysis projects globally. He has taught meta-analysis and systematic review courses and workshops around the world, and his work has been published in journals such as the Journal of International Business Studies, Journal of World Business, and Small Business Economics.


Dr. Jessica Flake, PhD

McGill University

Courses taught for CCRAM:
Scale Development and Psychometrics


Dr. Jessica Flake is an Assistant Professor of Quantitative Psychology and Modelling at McGill University. She regularly teaches measurement and statistics courses as well as workshops at international conferences. Her work focuses on technical and applied aspects of psychological measurement, including scale development, psychometric modelling, scale use, and replicability, and is published in top journals such as Nature Human Behaviour, Psychological Methods, Advances in Methods and Practices in Psychological Science, Structural Equation Modeling, Psychological Science, and the Journal of Personality and Social Psychology. Dr. Flake was named an Association for Psychological Science Rising Star in 2021 and received a Society for the Improvement of Psychological Science Commendation in 2020 for her research into questionable measurement practices.

She also works in applied psychometrics as a technical advisory panel member for the Enrollment Management Association, a non-profit that develops educational assessments, and serves as the Assistant Director for Methods at the Psychological Science Accelerator, a laboratory network that conducts large-scale studies.


  • Luong, R. & Flake, J.K. (in press). Measurement invariance testing using confirmatory factor analysis and alignment optimization: A tutorial for transparent analysis planning and reporting. Psychological Methods.
  • Flake, J. K., Luong, R., & Shaw, M. (2022). Addressing a crisis of generalizability with large-scale construct validation. Behavioral and Brain Sciences, 45, e14.
  • Flake, J. K., Davidson, I. J., Wong, O., & Pek, J. (2022). Construct validity and the validity of replication studies: A systematic review. American Psychologist, 77, 576-588.
  • Flake, J.K. (2021). Strengthening the foundation of educational psychology by integrating construct validation into open science reform. Educational Psychologist. 56, 132-141.
  • Beymer, P.N., Ferland, M., & Flake, J.K. (2021). Validity evidence for a short scale of college students’ perceptions of cost. Current Psychology, 1-20.
  • Hwang, H., Cho, G., Jung, K., Falk, C., Flake, J.K., & Jin, M. (2021). An approach to structural equation modeling with both factors and components: Integrated generalized structured component analysis. Psychological Methods, 26, 273–294.
  • Flake, J.K., & Fried, E.I. (2020). Measurement schmeasurement: Questionable measurement practices and how to avoid them. Advances in Methods and Practices in Psychological Science, 3, 456-465.
  • Shaw, M., Cloos, L., Luong, R., Elbaz, S. & Flake, J.K. (2020). Measurement practices in large-scale replications: Insights from Many Labs 2. Canadian Psychology, 61, 289-298.
  • Hehman, E., Calanchini, J., Flake, J. K., & Leitner, J. B. (2019). Establishing construct validity evidence for regional measures of explicit and implicit racial bias. Journal of Experimental Psychology: General, 148(6), 1022-1040.
  • Flake, J.K., & McCoach, D.B. (2018). An investigation of the alignment method with polytomous indicators under conditions of partial measurement invariance. Structural Equation Modeling: A Multidisciplinary Journal, 25(1), 56-70.
  • Flake, J.K., Pek, J., & Hehman, E. (2017). Construct validation in social and personality research: Current practice and recommendations. Social Psychological and Personality Science, 8, 370-378.
  • Flora, D. & Flake, J.K. (2017). The purpose and practice of exploratory and confirmatory factor analysis in psychological research: Decisions for scale development and validation. Canadian Journal of Behavioural Science, 49, 78-88.
  • Goldstein, J., & Flake, J.K. (2016). Towards a framework for the validation of early childhood assessment systems. Educational Assessment, Evaluation and Accountability, 23, 273-293.
  • Flake, J. K., Barron, K. E., Hulleman, C., McCoach, B. D., & Welsh, M. E. (2015). Measuring cost: The forgotten component of expectancy-value theory. Contemporary Educational Psychology, 41, 232–244.


Dr. Jenny Godley, PhD

University of Calgary

Courses taught for CCRAM:
Introduction to Social Network Analysis


Dr. Jenny Godley is an Associate Professor in the Department of Sociology and an Adjunct Associate Professor in the Department of Community Health Sciences, Cumming School of Medicine at the University of Calgary. Trained as a quantitative sociologist, she uses demographic and social network analytic techniques to examine the processes which lead to and perpetuate social inequalities in health. She has experience analyzing large, population-level datasets and administrative data and collecting and analyzing both whole and ego-centred social network data. She is a popular undergraduate teacher and an award-winning graduate supervisor.

Godley is also the Chair of the Conjoint Faculties Research Ethics Board at the University of Calgary and has a particular interest in the ethical issues which arise when collecting social network data.

Haan, M., & Godley, J. (2016). An introduction to statistics for Canadian social scientists (3rd ed.). Oxford University Press.

  • Stearns, J.A., Godley, J., Veugelers, P.J., Ekwaru, J.P., Bastian, K., Wu, B., and Spence, J.C. (2019). Association of friendships and children’s physical activity during and outside of school: A social network study. Social Science and Medicine – Population Health, 7.
  • McIntyre, L., Jessiman-Perreault, G., Mah, C. and Godley, J. (2018). A social network analysis of Canadian food insecurity policy actors. Canadian Journal of Dietetic Practice and Research. 79(2):60-66.
  • Godley, J., Glenn, N.M., Sharma, A.M. and Spence, J.C. (2014). Networks of Trainees: Examining the effects of attending an interdisciplinary research training camp on the careers of new obesity scholars. Journal of Multidisciplinary Healthcare. 7:459-470.
  • Godley, J., Sharkey, K.A. and Weiss, S. (2013). Networks of Neuroscientists: Professional interactions within an interdisciplinary brain research institute. Journal of Research Administration. 44(2):94-126.
  • Godley, J., Barron, G. and Sharma, A. (2011). Using social network analysis to assess collaboration in health research. The Journal of Healthcare, Science and the Humanities. 1(2):99-116.
  • Haines, V.A., Godley, J. and Hawe, P. (2011). Understanding interdisciplinary collaborations as social networks.  American Journal of Community Psychology. 47:1-11.
  • Fur, R., Henderson, E.A. Read, R.R., Godley, J., Roy, C. and Bush, K. (2011). The use of social network analysis to quantify the importance of sex partner meeting venues in an infectious syphilis outbreak in Alberta, Canada. Sexually Transmitted Infections. 87: A164-A165.
  • Godley, J. and Russell-Mayhew, S. (2010). Interprofessional relationships in the field of obesity: Data from Canada. Journal of Research in Interprofessional Practice and Education. 1(2):88-108.
  • Godley, J. (2008). Preference or propinquity?  The relative influence of selection and opportunity on friendship homophily in college. Connections. 28(2):65-80.
  • Godley, J. (2001). The influence of sibling ties on women’s contraceptive method choice in Nang Rong, Thailand. International Family Planning Perspectives. 27(1):4-10. 


Dr. Murtaza Haider, PhD

Toronto Metropolitan University

CCRAM Advisory Board Member


Dr. Murtaza Haider is a professor of Data Science and Real Estate Management at Toronto Metropolitan University. He also serves as research director of the Urban Analytics Institute and director of Regionomics Inc., a boutique consulting firm specializing in the economics of cities and regions, and holds an adjunct professorship in Engineering at McGill University. His research interests include business analytics, data science, housing market dynamics, transport/infrastructure/urban planning, and human development in Canada and South Asia. He is a syndicated columnist with Postmedia, and his weekly columns on real estate markets appear nationally in various Canadian news outlets.

Haider, M. (2016). Getting started with data science: Making sense of data with analytics. IBM Press.


Dr. Gerald Häubl, PhD

University of Alberta

Courses taught for CCRAM:
Experimental Methods for Behavioural Science


Dr. Gerald Häubl is the Ronald K. Banister Chair in Business and a Professor of Marketing at the University of Alberta School of Business. His primary research areas are judgment and decision making, consumer psychology, the construction of preferences and value, the enjoyment of consumption experiences, financial decision making, choice architecture, real-time decision assistance for consumers, motivation and goal pursuit, self-control, and bidding behavior in interactive-pricing markets. Gerald’s research has been published in numerous top-tier academic journals, including Psychometrika, Psychological Science, the Journal of Experimental Psychology, the Journal of Personality and Social Psychology, Information Systems Research, MIS Quarterly, the Journal of Consumer Research, the Journal of Consumer Psychology, the Journal of Marketing, the Journal of Marketing Research, and Marketing Science. Gerald currently teaches a course on experimental methods for behavioral science and a seminar on consumer behavior, both at the PhD level.


Dr. Andrea Howard, PhD

Carleton University


Dr. Howard is a developmental psychologist with expertise in multilevel analysis and structural equation modelling. She completed her graduate training in developmental psychology at the University of Alberta and postdoctoral training at the University of North Carolina at Chapel Hill in quantitative psychology. She is now an Associate Professor of developmental and quantitative psychology at Carleton University, regularly teaching graduate courses in ANOVA, regression, multilevel modelling, and structural equation modelling. She is Past-Chair of the Section on Quantitative Methods of the Canadian Psychological Association.

Alongside her primary program of research on mental health and substance use in the transition to adulthood, Dr. Howard has published several empirical and pedagogical articles related to longitudinal data analysis. 

  • Howard, A. L., & Lamb, M. (2023). Compliance trends in a 14-week ecological momentary assessment study of undergraduate alcohol drinkers. Assessment.
  • Bradley, A. H. M., & Howard, A. L. (2023). Stress and mood associations with smartphone use in university students: A 12-week longitudinal study. Clinical Psychological Science.
  • Howard, A. L. (2021). A guide to visualizing trajectories of change with confidence bands and raw data. Advances in Methods and Practices in Psychological Science, 4(4), 1-13.
  • Bainter, S. A. & Howard, A. L. (2016). Comparing within-person effects from multivariate longitudinal models. Developmental Psychology, 52(11), 1955-1968.
  • Howard, A. L. (2015). Leveraging time-varying covariates to test within- and between-person effects and interactions in the multilevel linear model. Emerging Adulthood, 3(6), 400-412.
  • Curran, P. J., Howard, A. L., Bainter, S. A., Lane, S., & McGinley, J. (2014). The separation of between- and within-person components of individual change over time: A latent curve model with structured residuals. Journal of Consulting and Clinical Psychology, 82(5), 879-894.
  • Bauer, D. J., Howard, A. L., Baldasaro, R., Curran, P. J., Hussong, A. M., Chassin, L., & Zucker, R. (2013). A tri-factor model for integrating ratings across multiple informants. Psychological Methods, 18(4), 475-493.


Dr. Matthew McLarnon, PhD

Mount Royal University

Courses taught for CCRAM:
Latent Profile Analysis


Dr. Matthew McLarnon is an Associate Professor in Human Resource Management in the Bissett School of Business at Mount Royal University. Matt’s research is wide-ranging, but most often centres on the substantive topics of employee attitudes, resiliency, and optimizing interpersonal dynamics in work teams, along with the use and application of advanced analytical methods such as mixture models, multilevel structural equation models, and longitudinal models. Matt’s work has been published in leading journals such as Organizational Research Methods, Journal of Applied Psychology, Journal of Management, Safety Science, Academy of Management Learning and Education, and Human Resource Management Review.

  • McLarnon, M.J.W., Gellatly, I.R., Richards, D.A., & Arazy, O. (in press). Knowledge sharing processes and the role of attachment patterns. Journal of Knowledge Management.
  • Knight, C., McLarnon, M.J.W., Parker, S.K., & Wenzel, R. (2022). The importance of relational work design characteristics: A person-centred approach. Australian Journal of Management, 47, 705-728.
  • McLarnon, M.J.W. (2022). Into the heart of darkness: A person-centered exploration of the Dark Triad. Personality and Individual Differences, 186, 111354.
  • Lefsrud, L.M., McLarnon, M.J.W., & Gellatly, I.R. (2021). A pattern-oriented approach to safety climate: An empirical example. Safety Science, 142, 105385.
  • Dobson, K.S., McLarnon, M.J.W., Pandya, K., & Pusch, D. (2021). A latent profile analysis of adverse childhood experiences and adult health in a community sample. Child Abuse & Neglect, 114, 104927.
  • McLarnon, M.J.W., & O’Neill, T.A. (2018). Extensions of auxiliary variable approaches for the investigation of mediation, moderation, and conditional effects in mixture models. Organizational Research Methods, 21, 955-982.
  • O’Neill, T.A., McLarnon, M.J.W., Hoffart, G.C., Woodley, H.J., & Allen, N.J. (2018). The structure and function of team conflict state profiles. Journal of Management, 44, 811-836.
  • O’Neill, T.A., Hoffart, G.C., McLarnon, M.J.W., Woodley, H.J., Eggermont, M., Rosehart, W., & Brennan, R. (2017). Constructive controversy and reflexivity training promotes effective conflict profiles and outcomes in student learning teams. Academy of Management Learning and Education, 17, 257-276.
  • O’Neill, T.A., McLarnon, M.J.W., Xiu, L., & Law, S.J. (2016). Core self-evaluations, perceptions of group potency, and job performance: The moderating role of individualism and collectivism cultural profiles. Journal of Occupational and Organizational Psychology, 89, 447-473.


Dr. Cheryl Poth, PhD

University of Alberta

Courses taught for CCRAM:

Introduction to Mixed Methods Research


Dr. Cheryl Poth is a Professor in the Centre for Research and Applied Measurement and Evaluation in the Faculty of Education at the University of Alberta. Her specific research interests include enhancing research quality and interdisciplinary research teams. She is an award-winning instructor and author of three research-focused textbooks and editor of the forthcoming SAGE Handbook of Mixed Methods Research Design. Her books, the 4th edition of Qualitative Inquiry & Research Design with John Creswell (2017, Sage), Innovation in Mixed Methods Research: Guiding Practices for Integrative Thinking with Complexity (2018, Sage), and Research Ethics (2021, Sage), are inspired by practice dilemmas experienced in the field. In 2023, she received the Division D Significant Contributions to Research Methodologies Award from the American Educational Research Association for her work in multiple and mixed methodologies.

She is a founding board member and served as fourth President of the Mixed Methods International Research Association. She has delivered mixed-methods and qualitative research keynotes, invited talks, short courses, and workshops on four continents and across Canada and the US. She has guest edited several special issues in the Journal of Mixed Methods Research and the International Journal of Qualitative Inquiry and authored works across a wide variety of journals and books in education, evaluation, research methods, medicine, business, and the health sciences.

  • Poth, C. (Ed.). (in press). The SAGE handbook of mixed methods research design. Thousand Oaks, CA: Sage Publications.
  • Poth, C. (2021). Research ethics. Thousand Oaks, CA: Sage Publications.
  • Poth, C. (2018). Innovation in mixed methods research: Integrative thinking with complexity. Thousand Oaks, CA: Sage Publications.
  • Creswell, J., & Poth, C. (2017). Qualitative inquiry & research design (4th ed.). Thousand Oaks, CA: Sage Publications.
  • Poth, C., Bullock, E., & Eppel, E. (in press). Adaptive mixed methods research design practices to address complexity in business and management research. In R. Cameron & X. Golenko (Eds.), Handbook of Mixed Methods in Business and Management. Edward Elgar.
  • Poth, C., & Bullock, E. (in press). Mixed methods research design practices to address complexity in education. In R. Tierney, F. Rizvi, K. Ercikan, & G. Smith (Eds.), International Encyclopedia of Education (4th ed.). Elsevier.
  • Poth, C., Creamer, E., & Cain, L. (in press). Promising ethical practices for fully integrated mixed methods research. In R. Tierney, F. Rizvi, K. Ercikan, & G. Smith (Eds.), International Encyclopedia of Education (4th ed.). Elsevier.
  • Gokiert, R., Poth, C., Kingsley, B., El Hassar, & Tink, L. (2022). Insights from a complex mixed methods community-based participatory design for responding to the evaluation capacity needs of the early childhood field. Canadian Journal of Program Evaluation, 36(3), 287-315.
  • Poth, C., Bulut, O., Aquilina, A., & Otto, S. J. G. (2021). Using data mining for rapid complex case study descriptions: Examples of public health briefings during the onset of the COVID-19 pandemic. Journal of Mixed Methods Research, 15(3), 348-373.
  • Poth, C. (2022). Mixed methods integration in times of complexity. In J. Hitchcock & A. Onwuegbuzie (Eds.), Routledge Handbook for Advancing Integration in Mixed Methods Research (pp. 169-191). Routledge.
  • Miller-Young, J., & Poth, C. (2021). ‘Complexifying’ our approach to assessing educational development outcomes: Bridging theoretical innovations with frontline practice. International Journal for Academic Development.
  • Poth, C. (2020). Confronting complex problems with adaptive mixed methods research practices. Caribbean Journal of Mixed Methods Research, 1(1), 29-46.
  • Poth, C., Searle, M., Aquilina, A., Ge, J., & Elder, A. (2020). Using the CIPP model to assess the impact of a competency-based approach to evaluation education: A mixed methods case study of the student experience in a Canadian graduate course. Evaluation and Program Planning (Special Collection of Papers on Evaluator Education), 79, 1–17.
  • Poth, C. (2019). Realizing the integrative capacity of educational mixed methods research teams: Using a complexity-sensitive strategy to boost innovation. International Journal of Research and Method in Education (mixed methods special issue), 42(3), 252-266.
  • Poth, C. (2018). The contributions of mixed insights to advancing technology-enhanced formative assessments within higher education learning environments. International Journal of Educational Technology in Higher Education, 15(9), 1–19.
  • Poth, C. (2014). What constitutes effective learning experiences in a mixed methods research course? An examination from the student perspective. International Journal of Multiple Research Approaches, 8(1), 74–86.
  • Poth, C. (2012). Exploring the role of mixed methods practitioner within educational research teams: A cross case comparison of the research planning process. International Journal of Multiple Research Approaches (“Mixed Methods Research in Education” special issue), 6, 315–332.


Dr. Jason Rights, PhD

University of British Columbia

Courses taught for CCRAM:
Introduction to Multilevel Modeling


Dr. Jason Rights is an Assistant Professor of Quantitative Methods in the Department of Psychology at the University of British Columbia. His primary research focus is on addressing methodological complexities and developing statistical methods for multilevel/hierarchical data contexts (e.g., patients nested within clinicians, students nested within schools, or repeated measures nested within individuals). Specifically, he has recently been involved in several lines of research: (1) developing R-squared measures and methods for multilevel models; (2) addressing unappreciated consequences of conflating level-specific effects in analysis of multilevel data; (3) delineating relationships between multilevel models and other commonly used models, such as mixture models; and (4) advancing model selection and comparison methods for latent variable models. To aid researchers in applying his work, he develops software, primarily in R, that is openly available for public use.

  • Rights, J.D., & Sterba, S.K. (in press). R-squared measures for multilevel models with three or more levels. Multivariate Behavioral Research.
  • Rights, J.D., & Sterba, S.K. (2021). Effect size measures for longitudinal growth analyses: Extending a framework of multilevel model R-squareds to accommodate heteroscedasticity, autocorrelation, nonlinearity, and alternative centering strategies. New Directions for Child and Adolescent Development (Special Issue: Developmental Methods), 175, 65-110.
  • Rights, J.D., & Sterba, S.K. (2020). New recommendations on the use of R-squared differences in multilevel model comparisons. Multivariate Behavioral Research, 55, 568-599.
  • Rights, J.D., Preacher, K.J., & Cole, D.A. (2020). The danger of conflating level-specific effects of control variables when primary interest lies in level-2 effects. British Journal of Mathematical and Statistical Psychology, 73, 194-211.
  • Cole, D.A., Lu, R., Rights, J.D., Mick, C.R., Lubarsky, S.R., Gabruk, M.E., Lovette, A.J., Zhang, Y., Ford, M.A., & Nick, E.A. (2020). Emotional and cognitive reactivity: Validating a multilevel modeling approach to daily diary data. Psychological Assessment, 32, 431-441.
  • Rights, J.D., & Sterba, S.K. (2019). Quantifying explained variance in multilevel models: An integrative framework for defining R-squared measures. Psychological Methods, 24, 309-338.
  • Rights, J.D., & Cole, D.A. (2018). Effect size measures for multilevel models in clinical child and adolescent research: New R-squared methods and recommendations. Journal of Clinical Child & Adolescent Psychology, 47, 863-873.
  • Rights, J.D., Sterba, S.K., Cho, S.-J., & Preacher, K.J. (2018). Addressing model uncertainty in item response theory person scores through model averaging. Behaviormetrika, 45, 495-503.
  • Rights, J.D., & Sterba, S.K. (2018). A framework of R-squared measures for single-level and multilevel regression mixture models. Psychological Methods, 23, 434-457.
  • Sterba, S.K., & Rights, J.D. (2017). Effects of parceling on model selection: Parcel-allocation variability in model ranking. Psychological Methods, 22, 47-68.
  • Rights, J.D., & Sterba, S.K. (2016). The relationship between multilevel models and nonparametric multilevel mixture models: Discrete approximation of intraclass correlation, random coefficient distributions, and residual heteroscedasticity. British Journal of Mathematical and Statistical Psychology, 69, 316-343.
  • Sterba, S.K., & Rights, J.D. (2016). Accounting for parcel-allocation variability in practice: Combining sources of uncertainty and choosing the number of allocations. Multivariate Behavioral Research, 51, 296-313.

Piers Steel

Dr. Piers Steel, PhD

University of Calgary

Courses taught for CCRAM:
  • Systematic Review and Meta-Analysis


Dr. Piers Steel is a Professor and the Brookfield Research Chair at the Haskayne School of Business at the University of Calgary. Steel’s areas of research interest include culture, motivation, and decision-making, and he also has expertise in systematic review and meta-analysis. He is a member of the Society for Research Synthesis Methodology, has published several methodology papers on how to improve meta-analysis, and is a co-founder of the online meta-analytic platforms HubMeta and metaBUS.

Steel’s work has appeared in the Journal of Personality and Social Psychology, Psychological Bulletin, Personality and Social Psychology Review, Journal of Applied Psychology, Personnel Psychology, and the Academy of Management Review, among others. He is a fellow of the American Psychological Association, the Society for Industrial and Organizational Psychology, and the Association for Psychological Science. His meta-analytic work has been reported globally in thousands of news articles and produced one best-selling book:

Steel, P. (2012). The procrastination equation: How to stop putting things off and start getting stuff done. Harper Perennial.

  • Ogunfowora, B., Nguyen, V. Q., Steel, P., & Hwang, C. C. (2021). A meta-analytic investigation of the antecedents, theoretical correlates, and consequences of moral disengagement at work. Journal of Applied Psychology.
  • Steel, P., Beugelsdijk, S., & Aguinis, H. (2021). The anatomy of an award-winning meta-analysis: Recommendations for authors, reviewers, and readers of meta-analytic reviews. Journal of International Business Studies, 52, 23-44.
  • Steel, P., Schmidt, J., Bosco, F., & Uggerslev, K. (2019). The effects of personality on job satisfaction and life satisfaction: A meta-analytic investigation accounting for bandwidth-fidelity and commensurability. Human Relations, 72, 217-247.
  • Doucouliagos, C., Stanley, T., & Steel, P. (2018). Does ICT generate economic growth? A meta-regression analysis. Journal of Economic Surveys, 32, 705-726.
  • Zeng, R., Grogaard, B., & Steel, P. (2018). Complements or substitutes? A meta-analysis of the role of integration mechanisms in knowledge transfer in the MNE Network. Journal of World Business, 53, 415-432.
  • Steel, P., Taras, V., Uggerslev, K., & Bosco, F. (2018). The happy culture: A meta-analytic review and empirical investigation of culture’s relationship with subjective wellbeing. Personality and Social Psychology Review, 22, 128-169.
  • Lee, C., Bosco, F., Steel, P., & Uggerslev, K. (2017). A metaBUS enabled meta-analysis of career satisfaction. Career Development International, 22, 565-582.
  • Simmons, S., Caird, J., & Steel, P. (2017). A meta-analysis of in-vehicle and nomadic voice recognition system interaction and driving performance. Accident Analysis and Prevention, 106, 21-43.
  • Bosco, F., Uggerslev, K., & Steel, P. (2017). metaBUS as a vehicle for facilitating meta-analysis. Human Resource Management Review, 27, 237-254.
  • Paterson, T. A., Harms, P. D., Steel, P., & Credé, M. (2016). An assessment of the magnitude of effect sizes: Evidence from 30 years of meta-analysis in management. Journal of Leadership and Organizational Studies, 23, 66-81.
  • Bosco, F., Steel, P., Oswald, F. L., Uggerslev, K., & Field, J. G. (2015). Cloud-based meta-analysis to bridge science and practice: Welcome to metaBUS. Personnel Assessment and Decisions, 1, Article 2.
  • Steel, P., Kammeyer-Mueller, J., & Paterson, T. (2015). Improving the meta-analytic assessment of effect size variance with an informed Bayesian prior. Journal of Management, 41, 718-743.
  • Caird, J. K., Johnston, K. A., Willness, C. R., Asbridge, M., & Steel, P. (2014). A meta-analysis of the effects of texting on driving. Accident Analysis & Prevention, 71, 311-318.
  • Merkin, R., Taras, V., & Steel, P. (2014). State of the art themes in cross-cultural communication research: A meta-analytic review. International Journal of Intercultural Relations, 38, 1-23.
  • Liu, X., Vredenburg, H., & Steel, P. (2014). A meta-analysis of factors leading to management control in international joint ventures. Journal of International Management, 20, 219-236.
  • Taras, V., Steel, P., & Kirkman, B. (2012). Improving national cultural indices using a longitudinal meta-analysis of Hofstede's dimensions. Journal of World Business, 47, 329-334.
  • Steel, P., & Taras, V. (2010). Culture as a consequence: A multilevel multivariate meta-analysis of the effects of individual and country characteristics on work-related cultural values. Journal of International Management, 16, 211-233.
  • Kammeyer-Mueller, J., Steel, P., & Rubenstein, A. (2010). The other side of method bias: The perils of distinct source research designs. Multivariate Behavioral Research, 45, 294-321.
  • Bowen, F., Rostami, M., & Steel, P. (2010). Timing is everything: A meta-analysis of the relationships between organizational performance and innovation. Journal of Business Research, 63, 1179-1185.
  • Steel, P., & Kammeyer-Mueller, J. (2009). Using a meta-analytic perspective to enhance Job Component Validation. Personnel Psychology, 62, 533-552.
  • Caird, J., Willness, C. R., Steel, P., & Scialfa, C. (2008). A meta-analysis of the effects of cell phones on driver performance. Accident Analysis & Prevention, 40, 1282-1293.
  • Steel, P., & Kammeyer-Mueller, J. (2008). Bayesian variance estimation for meta-analysis: Quantifying our uncertainty. Organizational Research Methods, 11, 54-78.
  • Steel, P. (2007). The nature of procrastination: A meta-analytic and theoretical review of quintessential self-regulatory failure. Psychological Bulletin, 133, 65-94.
  • Willness, C., Steel, P., & Lee, K. (2007). A meta-analysis of the antecedents and consequences of workplace sexual harassment. Personnel Psychology, 60, 127-162.
  • Steel, P., & Kammeyer-Mueller, J. (2002). Comparing meta-analytic moderator search techniques under realistic conditions. Journal of Applied Psychology, 87, 96-111.
Calling all methodologists

If you are a Canada-affiliated expert interested in bringing your passion and expertise in design and analysis to others through CCRAM, send us your information and we will be in touch to see how we can work together.