About - CCRAM

Meet our experts

At CCRAM, you will learn directly from methodology experts in data analysis, design, and methods.

The Canadian Centre for Research Analysis and Methods (CCRAM) is the preeminent Canadian destination for academics, and for researchers in government and industry anywhere in the world, to learn from Canada’s leading behavioural science methodologists. Among its many services, CCRAM offers online and in-person seminars from Canada-affiliated researchers and educators whose work focuses on the development and dissemination of methodological knowledge and tools useful to behavioural scientists. CCRAM seminars cater to scientists across disciplines – social sciences, health, business, or elsewhere in academia, government, or industry – who want to learn directly from methodology experts in data analysis, design, and methods.


Andrew F. Hayes

Dr. Andrew F. Hayes, PhD

University of Calgary

Courses taught for CCRAM:
Mediation, Moderation, and Conditional Process Analysis


Dr. Andrew F. Hayes brings his passion for quantitative methodology to researchers as the Director of the Canadian Centre for Research Analysis and Methods (CCRAM).

Truly an expert in the field, Hayes created the PROCESS macro for SPSS, SAS, and R, widely used by researchers examining the mechanisms and contingencies of effects. He has written extensively on research methodology, including Introduction to Mediation, Moderation, and Conditional Process Analysis (2022), Regression Analysis and Linear Models (2017), and Statistical Methods for Communication Science (2005).
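For readers unfamiliar with the technique, the core computation that tools like PROCESS automate can be sketched in a few lines: estimate the X→M path (a) and the M→Y path controlling for X (b), then form a percentile-bootstrap confidence interval for the indirect effect a·b. The sketch below uses simulated data with made-up path values; it is an illustration of the general approach, not PROCESS itself:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Simulated data with hypothetical true paths: a = 0.5 (X -> M), b = 0.4 (M -> Y)
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)
y = 0.4 * m + 0.2 * x + rng.normal(size=n)

def indirect_effect(x, m, y):
    """a*b: slope of M on X, times slope of M in the regression Y ~ X + M."""
    a = np.polyfit(x, m, 1)[0]                      # X -> M path
    design = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(design, y, rcond=None)[0][2]  # M -> Y path, controlling for X
    return a * b

point = indirect_effect(x, m, y)

# Percentile bootstrap: resample cases, re-estimate a*b each time
boot = np.empty(2000)
for i in range(boot.size):
    idx = rng.integers(0, n, n)
    boot[i] = indirect_effect(x[idx], m[idx], y[idx])
ci = np.percentile(boot, [2.5, 97.5])
```

If the 95% interval excludes zero, the indirect effect is deemed statistically distinguishable from zero, the inference PROCESS reports for mediation models.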

Within the centre, he teaches courses on applied data analysis and conducts online and in-person workshops on statistical analysis for multidisciplinary audiences throughout the world, most frequently faculty and graduate students in business schools, but also researchers in education, psychology, social work, communication, public health, and government.

He is currently a Distinguished Research Professor at the Haskayne School of Business at the University of Calgary, with an adjunct appointment in the Department of Psychology. His work has been cited well over 140,000 times according to Google Scholar, and he was designated a Highly Cited Researcher by Clarivate Analytics in 2019, 2020, and 2021.

  • Hayes, A. F. (2022). Introduction to mediation, moderation, and conditional process analysis: A regression-based approach (3rd edition). New York: The Guilford Press.
  • Hayes, A. F. (2017). Regression analysis and linear models: Concepts, applications, and implementation. New York: The Guilford Press.
  • Hayes, A. F. (2005). Statistical methods for communication science. Routledge/Taylor and Francis.
  • Igartua, J.-J., & Hayes, A. F. (2021). Mediation, moderation, and conditional process analysis: Concepts, computations, and some common confusions. Spanish Journal of Psychology, 24, e49.
  • Hayes, A. F., & Rockwood, N. J. (2020). Conditional process analysis: Concepts, computations, and advances in the modeling of the contingencies of mechanisms. American Behavioral Scientist, 64, 19-54.
  • Coutts, J. J., Hayes, A. F., & Jiang, T. (2019). Easy statistical mediation analysis with distinguishable dyadic data. Journal of Communication, 69, 612-649.
  • Hayes, A. F. (2018). Partial, conditional, and moderated moderated mediation: Quantification, inference, and interpretation. Communication Monographs, 85, 4-40.
  • Hayes, A. F., & Rockwood, N. J. (2017). Regression-based statistical mediation and moderation analysis in clinical research: Observations, recommendations, and implementation. Behaviour Research and Therapy, 98, 39-57.
  • Hayes, A. F., Montoya, A. K., & Rockwood, N. J. (2017). The analysis of mechanisms and their contingencies: PROCESS versus structural equation modeling. Australasian Marketing Journal, 25, 76-81.
  • Hayes, A. F., & Montoya, A. K. (2017). A tutorial on testing, visualizing, and probing interaction involving a multicategorical variable in linear regression analysis. Communication Methods and Measures, 11, 1-30.
  • Montoya, A. K., & Hayes, A. F. (2017). Two condition within-participant statistical mediation analysis: A path-analytic framework. Psychological Methods, 22, 6-27.
  • Hayes, A. F. (2015). An index and test of linear moderated mediation. Multivariate Behavioral Research, 50, 1-22.
  • Hayes, A. F. (2014). Statistical mediation analysis with a multicategorical independent variable. British Journal of Mathematical and Statistical Psychology, 67, 451-470.
  • Hayes, A. F., & Scharkow, M. (2013). The relative trustworthiness of inferential tests of the indirect effect in statistical mediation analysis: Does method really matter? Psychological Science, 24, 1918-1927.
  • Hayes, A. F., & Preacher, K. J. (2010). Estimating and testing indirect effects in simple mediation models when the constituent paths are nonlinear. Multivariate Behavioral Research, 45, 627-660.
  • Hayes, A. F. (2009). Beyond Baron and Kenny: Statistical mediation analysis in the new millennium. Communication Monographs, 76, 408-420.
  • Preacher, K. J., & Hayes, A. F. (2008). Asymptotic and resampling strategies for assessing and comparing indirect effects in multiple mediator models. Behavior Research Methods, 40, 879-891.

Felix Cheung

Dr. Felix Cheung, PhD

University of Toronto

Courses taught for CCRAM:
Doing Open and Replicable Science


Dr. Felix Cheung joins CCRAM to teach Doing Open and Replicable Science. Cheung is focused on promoting population well-being through sound, empirical research. Currently an Assistant Professor in the Department of Psychology at the University of Toronto, Cheung researches the determinants and consequences of subjective well-being across diverse populations. He also studies meta-science, the scientific study of science, examining how the reliability of scientific findings can be improved by 'Big Science' (studies done by large collaborative teams), open science practices (pre-registration and data sharing), and research incentives.

Cheung’s research is timely as it is focused on addressing pressing global issues such as sociopolitical unrest, income inequality and terrorism.

  • Landy, J. F., Jia, M., Ding, I. L., Viganola, D., Tierney, W., Dreber, A., … Cheung, F., … Uhlmann, E. L. (2020). Crowdsourcing hypothesis tests: Making transparent how design choices shape research results. Psychological Bulletin, 146(5), 451-479.
  • Silberzahn, R., Uhlmann, E.L., Martin, D.P., Anselmi, P., Aust, F., Awtrey, E., Cheung, F., … Nosek, B. A. (2018). Many analysts, one dataset: Making transparent how variations in analytical choices affect results. Advances in Methods and Practices in Psychological Science, 1, 337-356.
  • Anderson, C. J., Bahník, Š., Barnett-Cowan, M., Bosco, F. A., Chandler, J., Chartier, C. R., … Cheung, F., …, & Zuni, K. (2016). Response to a comment on “Estimating the reproducibility of psychological science”. Science, 351(6277), 1037.
  • Schweinsberg, M., Madan, N., Vianello, M., Sommer, S. A., Jordan, J., Tierney, W., Awtrey, E., Zhu, L., … Cheung, F., … , & Uhlmann, E. L. (2016). The pipeline project: Pre-publication independent replications of a single laboratory's research pipeline. Journal of Experimental Social Psychology, 66, 55-67.
  • Tierney, W., Schweinsberg, M., Jordan, J., Kennedy, D. M., Qureshi, I., Sommer, S.A., … Cheung, F., …, & Uhlmann, E. L. (2016). Data from a pre-publication independent replication initiative: The pipeline project. Scientific Data, 3, 160082.
  • Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251).
  • Johnson, D.J., Cheung, F., & Donnellan, M.B. (2014). Hunting for artifacts: The perils of dismissing inconsistent replication results. Social Psychology, 45, 318-320.

Hadi Fariborzi

Dr. Hadi Fariborzi, PhD

Mount Royal University

Courses taught for CCRAM:
Systematic Review and Meta-Analysis


Dr. Fariborzi received his PhD in Strategy and Global Management from the Haskayne School of Business, University of Calgary. He is now an Assistant Professor of Innovation and Entrepreneurship at Mount Royal University. Hadi’s primary areas of research are international entrepreneurship, the growth of small and new ventures, and national cultures, with a particular focus on systematic reviews and meta-analysis. Hadi co-founded the online meta-analytic platform HubMeta and has managed multiple teams of research assistants on meta-analysis projects globally. He has taught multiple meta-analysis and systematic review courses and workshops around the world. His work has been published in journals such as the Journal of International Business Studies, Journal of World Business, and Small Business Economics.

Jessica Flake

Dr. Jessica Flake, PhD

McGill University

Courses taught for CCRAM:
Scale Development and Psychometrics


Dr. Jessica Flake brings her expertise to CCRAM by teaching Scale Development and Psychometrics. Flake was named an Association for Psychological Science Rising Star in 2021 and received a Society for the Improvement of Psychological Science Commendation in 2020 for her research into questionable measurement practices.

She is an Assistant Professor of Quantitative Psychology and Modelling at McGill University, regularly teaching measurement and statistics courses as well as workshops at international conferences.

Her work focuses on technical and applied aspects of psychological measurement, including scale development, psychometric modelling, scale use, and replicability, and has been published in top journals such as Nature Human Behaviour, Psychological Methods, Advances in Methods and Practices in Psychological Science, Structural Equation Modeling, Psychological Science, and the Journal of Personality and Social Psychology.

She also works in applied psychometrics as a technical advisory panel member for the Enrollment Management Association, a non-profit that develops educational assessments, and serves as the Assistant Director for Methods at the Psychological Science Accelerator, a laboratory network that conducts large-scale studies.

  • Luong, R. & Flake, J.K. (in press). Measurement invariance testing using confirmatory factor analysis and alignment optimization: A tutorial for transparent analysis planning and reporting. Psychological Methods.
  • Flake, J. K., Shaw, M., & Luong, R. (in press). Addressing a crisis of generalizability with large-scale construct validation. Behavioral and Brain Sciences.
  • Flake, J.K. (2021). Strengthening the foundation of educational psychology by integrating construct validation into open science reform. Educational Psychologist. 56, 132-141.
  • Beymer, P.N., Ferland, M., & Flake, J.K. (2021). Validity evidence for a short scale of college students’ perceptions of cost. Current Psychology, 1-20.
  • Hwang, H., Cho, G., Jung, K., Falk, C., Flake, J.K., & Jin, M. (2021). An approach to structural equation modeling with both factors and components: Integrated generalized structured component analysis. Psychological Methods, 26, 273–294.
  • Flake, J.K., & Fried, E.I. (2020). Measurement schmeasurement: Questionable measurement practices and how to avoid them. Advances in Methods and Practices in Psychological Science, 3, 456-465.
  • Shaw, M., Cloos, L., Luong, R., Elbaz, S. & Flake, J.K. (2020). Measurement practices in large-scale replications: Insights from Many Labs 2. Canadian Psychology, 61, 289-298.
  • Hehman, E., Calanchini, J., Flake, J. K., & Leitner, J. B. (2019). Establishing construct validity evidence for regional measures of explicit and implicit racial bias.  Journal of Experimental Psychology: General. 148 (6) 1022-1040.
  • Flake, J.K., & McCoach, D.B. (2018). An investigation of the alignment method with polytomous indicators under conditions of partial measurement invariance. Structural Equation Modeling: A Multidisciplinary Journal, 25 (1), 56-70.
  • Flake, J.K., Pek, J., & Hehman, E. (2017). Construct validation in social and personality research: Current practice and recommendations. Social Psychological and Personality Science, 8, 370-378.
  • Flora, D. & Flake, J.K. (2017). The purpose and practice of exploratory and confirmatory factor analysis in psychological research: Decisions for scale development and validation. Canadian Journal of Behavioural Science, 49, 78-88.
  • Goldstein, J., & Flake, J.K. (2016). Towards a framework for the validation of early childhood assessment systems. Educational Assessment, Evaluation and Accountability, 23, 273-293.
  • Flake, J. K., Barron, K. E., Hulleman, C., McCoach, B. D., & Welsh, M. E. (2015). Measuring cost: The forgotten component of expectancy-value theory. Contemporary Educational Psychology, 41, 232–244.

Jenny Godley

Dr. Jenny Godley, PhD

University of Calgary

Courses taught for CCRAM:
Introduction to Social Network Analysis


Dr. Jenny Godley brings a unique lens to CCRAM, evident in her teaching of Introduction to Social Network Analysis. Godley is an Associate Professor in the Department of Sociology and an Adjunct Associate Professor in the Department of Community Health Sciences, Cumming School of Medicine, at the University of Calgary. Trained as a quantitative sociologist, she uses demographic and social network analytic techniques to examine the processes that lead to and perpetuate social inequalities in health. She has experience analyzing large, population-level datasets and administrative data, and collecting and analyzing both whole and ego-centred social network data. She is a popular undergraduate teacher and an award-winning graduate supervisor.

Godley is also the Chair of the Conjoint Faculties Research Ethics Board at the University of Calgary and has a particular interest in the ethical issues that arise when collecting social network data.

Haan, M., & Godley, J. (2016). An introduction to statistics for Canadian social scientists (3rd Ed). Oxford University Press.

  • Stearns, J.A., Godley, J., Veugelers, P.J., Ekwaru, J.P., Bastian, K., Wu, B., and Spence, J.C. (2019). Association of friendships and children’s physical activity during and outside of school: A social network study. Social Science and Medicine – Population Health, 7.
  • McIntyre, L., Jessiman-Perreault, G., Mah, C. and Godley, J. (2018). A social network analysis of Canadian food insecurity policy actors. Canadian Journal of Dietetic Practice and Research. 79(2):60-66.
  • Godley, J., Glenn, N.M., Sharma, A.M. and Spence, J.C. (2014). Networks of Trainees: Examining the effects of attending an interdisciplinary research training camp on the careers of new obesity scholars. Journal of Multidisciplinary Healthcare. 7:459-470.
  • Godley, J., Sharkey, K.A. and Weiss, S. (2013). Networks of Neuroscientists: Professional interactions within an interdisciplinary brain research institute. Journal of Research Administration. 44(2):94-126.
  • Godley, J., Barron, G. and Sharma, A. (2011). Using social network analysis to assess collaboration in health research. The Journal of Healthcare, Science and the Humanities. 1(2):99-116.
  • Haines, V.A., Godley, J. and Hawe, P. (2011). Understanding interdisciplinary collaborations as social networks. American Journal of Community Psychology. 47:1-11.
  • Fur, R., Henderson, E.A., Read, R.R., Godley, J., Roy, C. and Bush, K. (2011). The use of social network analysis to quantify the importance of sex partner meeting venues in an infectious syphilis outbreak in Alberta, Canada. Sexually Transmitted Infections. 87: A164-A165.
  • Godley, J. and Russell-Mayhew, S. (2010). Interprofessional relationships in the field of obesity: Data from Canada. Journal of Research in Interprofessional Practice and Education. 1(2):88-108.
  • Godley, J. (2008). Preference or propinquity?  The relative influence of selection and opportunity on friendship homophily in college. Connections. 28(2):65-80.
  • Godley, J. (2001). The influence of sibling ties on women’s contraceptive method choice in Nang Rong, Thailand. International Family Planning Perspectives. 27(1):4-10. 

Murtaza Haider

Dr. Murtaza Haider, PhD

Toronto Metropolitan University


Dr. Murtaza Haider is a professor of Data Science and Real Estate Management at Toronto Metropolitan University. He also serves as research director of the Urban Analytics Institute and Director of Regionomics Inc., a boutique consulting firm specializing in the economics of cities and regions. He holds an adjunct professorship in Engineering at McGill University. His research interests include business analytics, data science, housing market dynamics, transport/infrastructure/urban planning, and human development in Canada and South Asia. He is a syndicated columnist with Postmedia and writes weekly columns on real estate markets that appear nationally in various Canadian news outlets.

Haider, M. (2016). Getting started with data science: Making sense of data with analytics. IBM Press.

Rex Kline

Dr. Rex Kline, PhD

Concordia University

Courses taught for CCRAM:
Structural Equation Modeling Done Right


Dr. Rex Kline brings his expertise in structural equation modeling to his CCRAM course, Structural Equation Modeling Done Right.

Kline is the author of Principles and Practice of Structural Equation Modeling, which, through four editions, has been one of the most widely cited introductory-level textbooks in the area. He has conducted research on the psychometric evaluation of cognitive abilities, behavioral and scholastic assessment of children, structural equation modeling, the training of researchers, statistics reform in the behavioral sciences, and usability engineering in computer science. Kline used his expertise to help revise journal article reporting standards for quantitative studies and introduced updated reporting standards for SEM studies as a member of the Publications and Communications Board Task Force of the American Psychological Association. He is currently a Professor in the Department of Psychology at Concordia University in Montréal, Québec, Canada.

  • Kline, R. B. (2015). Principles and practice of structural equation modeling (4th Edition). The Guilford Press.
  • Kline, R. B. (2019). Becoming a behavioral science researcher (2nd Edition). The Guilford Press.
  • Kline, R. B. (2013). Beyond significance testing: Statistics reform in the behavioral sciences (2nd Edition). American Psychological Association.
  • Kline, R. B. (in press). Assumptions in structural equation modeling. In R. Hoyle (Ed.), Handbook of structural equation modeling (2nd ed.). Guilford Press.
  • Kline, R. B. (in press). Questionable practices in statistical analysis. In H. Cooper, M. Coutanche, L. McMullen, A. Panter, D. Rindskopf, & K. J. Sher (Eds.), APA handbook of research methods in psychology (2nd ed.). American Psychological Association.
  • Kline, R. B. (in press). Structural equation modeling. In A. Nichols & J. Edlund (Eds.), Cambridge handbook of research methods and statistics for the social and behavioral sciences. Cambridge University Press.
  • Kline, R. B. (in press). Structural equation modeling. In R. Tierney, F. Rizvi, K. Ercikan, & G. Smith (Eds.), International encyclopedia of education (4th ed.). Elsevier.
  • Kline, R. B. (in press). Structural equation modeling in neuropsychology research. In G. Brown, B. Crosson, K. Haaland, & T. King (Eds.), APA handbook of neuropsychology. Washington DC: American Psychological Association.
  • Zhang, M. F., Dawson, J., & Kline, R. B. (2021). Evaluating the use of covariance-based structural equation modelling with reflective measurement in organisational and management research: A review and recommendations for best practice. British Journal of Management, 32, 257–272.
  • Kline, R. B. (2020). Post p-value education in graduate statistics: Preparing tomorrow’s psychology researchers for a post-crisis future. Canadian Psychology, 61, 331-341.
  • Appelbaum, M., Cooper, H., Kline, R. B., Mayo-Wilson, E., Nezu, A. M., & Rao, S. M. (2018). Journal article reporting standards for quantitative research in psychology: The APA Publications and Communications Board Task Force report. American Psychologist, 73, 3–25.
  • Goodboy, A. K., & Kline, R. B. (2017). Statistical and practical concerns with published communication research featuring structural equation modeling. Communication Research Reports, 34, 1–10.
  • Kline, R. B. (2015). The mediation myth. Basic and Applied Social Psychology, 37, 202–213.

Jason Rights

Dr. Jason Rights, PhD

University of British Columbia

Courses taught for CCRAM:
Introduction to Multilevel Modeling


Dr. Jason Rights brings his expertise in working with multilevel and hierarchical data to CCRAM students in his course, Introduction to Multilevel Modeling. Rights is currently an Assistant Professor of Quantitative Methods in the Department of Psychology at the University of British Columbia. His primary research focus is on addressing methodological complexities and developing statistical methods for multilevel/hierarchical data contexts (e.g., patients nested within clinicians, students nested within schools, or repeated measures nested within individuals). Specifically, he has recently been involved in several lines of research: (1) developing R-squared measures and methods for multilevel models; (2) addressing unappreciated consequences of conflating level-specific effects in analysis of multilevel data; (3) delineating relationships between multilevel models and other commonly used models, such as mixture models; and (4) advancing model selection and comparison methods for latent variable models. To aid researchers in applying his work, he develops software, primarily in R, that is openly available for public use.

  • Rights, J.D., & Sterba, S.K. (in press). R-squared measures for multilevel models with three or more levels. Multivariate Behavioral Research.
  • Rights, J.D., & Sterba, S.K. (2021). Effect size measures for longitudinal growth analyses: Extending a framework of multilevel model R-squareds to accommodate heteroscedasticity, autocorrelation, nonlinearity, and alternative centering strategies. New Directions for Child and Adolescent Development (Special Issue: Developmental Methods), 175, 65-110.
  • Rights, J.D., & Sterba, S.K. (2020). New recommendations on the use of R-squared differences in multilevel model comparisons. Multivariate Behavioral Research, 55, 568-599.
  • Rights, J.D., Preacher, K.J., & Cole, D.A. (2020). The danger of conflating level-specific effects of control variables when primary interest lies in level-2 effects. British Journal of Mathematical and Statistical Psychology, 73, 194-211.
  • Cole, D.A., Lu, R., Rights, J.D., Mick, C.R., Lubarsky, S.R., Gabruk, M.E., Lovette, A.J., Zhang, Y., Ford, M.A., Nick, E.A. (2020). Emotional and cognitive reactivity: Validating a multilevel modeling approach to daily diary data. Psychological Assessment, 32, 431-441.
  • Rights, J.D., & Cole, D.A. (2018). Effect size measures for multilevel models in clinical child and adolescent research: New R-squared methods and recommendations. Journal of Clinical Child & Adolescent Psychology, 47, 863-873.
  • Rights, J.D., & Sterba, S.K. (2019). Quantifying explained variance in multilevel models: An integrative framework for defining R-squared measures. Psychological Methods, 24, 309-338.
  • Rights, J.D., Sterba, S.K., Cho, S.-J., & Preacher, K.J. (2018). Addressing model uncertainty in item response theory person scores through model averaging. Behaviormetrika, 45, 495-503.
  • Rights, J.D., & Sterba, S.K. (2018). A framework of R-squared measures for single-level and multilevel regression mixture models. Psychological Methods, 23, 434-457.
  • Sterba, S.K., & Rights, J.D. (2017). Effects of parceling on model selection: Parcel-allocation variability in model ranking. Psychological Methods, 22, 47-68.
  • Rights, J.D., & Sterba, S.K. (2016). The relationship between multilevel models and nonparametric multilevel mixture models: Discrete approximation of intraclass correlation, random coefficient distributions, and residual heteroscedasticity. British Journal of Mathematical and Statistical Psychology, 69, 316-343.
  • Sterba, S.K., & Rights, J.D. (2016). Accounting for parcel-allocation variability in practice: Combining sources of uncertainty and choosing the number of allocations. Multivariate Behavioral Research, 51, 296-313.

Piers Steel

Dr. Piers Steel, PhD

University of Calgary

Courses taught for CCRAM:
Systematic Review and Meta-Analysis


Dr. Piers Steel has researched how to improve meta-analysis and brings this knowledge to CCRAM students in his course, Systematic Review and Meta-Analysis. Steel is Professor and the Brookfield Research Chair at the Haskayne School of Business at the University of Calgary. His particular areas of research interest include culture, motivation, and decision-making, and he also has expertise in systematic review and meta-analysis. He is a member of the Society for Research Synthesis Methodology, has published several methodology papers on how to improve meta-analysis, and is a cofounder of the online meta-analytic platforms HubMeta and metaBUS.
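As a taste of the computations such a course covers, here is a minimal sketch of DerSimonian-Laird random-effects pooling, a classic way meta-analysts combine effect sizes while estimating between-study variance. The effect sizes and sampling variances below are made up purely for illustration:

```python
def dersimonian_laird(effects, variances):
    """Pool study effect sizes with DerSimonian-Laird random-effects weights."""
    k = len(effects)
    w = [1.0 / v for v in variances]                               # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)    # fixed-effect estimate
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))  # heterogeneity statistic
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)                             # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]                 # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    return pooled, tau2

# Hypothetical effect sizes (e.g., standardized mean differences) and variances
pooled, tau2 = dersimonian_laird([0.1, 0.5, -0.1, 0.6], [0.02, 0.02, 0.02, 0.02])
```

With equal sampling variances, the random-effects weights are equal, so the pooled estimate reduces to the simple mean of the study effects; tau² captures the visible spread across the four hypothetical studies.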

Steel’s work has appeared in such outlets as the Journal of Personality and Social Psychology, Psychological Bulletin, Personality and Social Psychology Review, Journal of Applied Psychology, Personnel Psychology, and Academy of Management Review, among others. He is a fellow of the American Psychological Association, the Society for Industrial and Organizational Psychology, and the Association for Psychological Science. His meta-analytic work has been reported globally in thousands of news articles and produced one best-selling book.

Steel, P. (2012). The procrastination equation: How to stop putting things off and start getting stuff done. Harper Perennial.

  • Ogunfowora, B., Nguyen, V. Q., Steel, P., & Hwang, C. C. (2021). A meta-analytic investigation of the antecedents, theoretical correlates, and consequences of moral disengagement at work. Journal of Applied Psychology.
  • Steel, P., Beugelsdijk, S., & Aguinis, H. (2021). The anatomy of an award-winning meta-analysis: Recommendations for authors, reviewers, and readers of meta-analytic reviews. Journal of International Business Studies, 52, 23-44.
  • Steel, P., Schmidt, J., Bosco, F., & Uggerslev, K. (2019). The effects of personality on job satisfaction and life satisfaction: A meta-analytic investigation accounting for bandwidth-fidelity and commensurability. Human Relations, 72, 217–247.
  • Doucouliagos, C., Stanley, T., & Steel, P. (2018). Does ICT generate economic growth? A meta-regression analysis. Journal of Economic Surveys, 32, 705-726.
  • Zeng, R., Grogaard, B., & Steel, P. (2018). Complements or substitutes? A meta-analysis of the role of integration mechanisms in knowledge transfer in the MNE network. Journal of World Business, 53, 415-432.
  • Steel, P., Taras, V., Uggerslev, K., & Bosco, F. (2018). The happy culture: A meta-analytic review and empirical investigation of culture’s relationship with subjective wellbeing. Personality and Social Psychology Review, 22, 128-169.
  • Lee, C., Bosco, F., Steel, P., & Uggerslev, K. (2017). A metaBUS enabled meta-analysis of career satisfaction. Career Development International, 22, 565-582.
  • Simmons, S., Caird, J., & Steel, P. (2017). A meta-analysis of in-vehicle and nomadic voice recognition system interaction and driving performance. Accident Analysis and Prevention, 106, 21-43.
  • Bosco, F., Uggerslev, K., & Steel, P. (2017). metaBUS as a vehicle for facilitating meta-analysis. Human Resource Management Review, 27, 237-254.
  • Paterson, T. A., Harms, P. D., Steel, P., & Credé, M. (2016). An assessment of the magnitude of effect sizes: Evidence from 30 years of meta-analysis in management. Journal of Leadership and Organizational Studies, 23, 66-81.
  • Bosco, F., Steel, P., Oswald, F. L., Uggerslev, K., & Field, J. G. (2015). Cloud-based meta-analysis to bridge science and practice: Welcome to metaBUS. Personnel Assessment and Decisions, 1, Article 2.
  • Steel, P., Kammeyer-Mueller, J., & Paterson, T. (2015). Improving the meta-analytic assessment of effect size variance with an informed Bayesian prior. Journal of Management, 41, 718-743.
  • Caird, J. K., Johnston, K. A., Willness, C. R., Asbridge, M., & Steel, P. (2014). A meta-analysis of the effects of texting on driving. Accident Analysis & Prevention, 71, 311-318.
  • Merkin, R., Taras, V., & Steel, P. (2014). State of the art themes in cross-cultural communication research: A meta-analytic review. International Journal of Intercultural Relations, 38, 1-23.
  • Liu, X., Vredenburg, H., & Steel, P. (2014). A meta-analysis of factors leading to management control in international joint ventures. Journal of International Management, 20, 219-236.
  • Taras, V., Steel, P., & Kirkman, B. (2012). Improving national cultural indices using a longitudinal meta-analysis of Hofstede's dimensions. Journal of World Business, 47, 329-334.
  • Steel, P., & Taras, V. (2010). Culture as a consequence: A multilevel multivariate meta-analysis of the effects of individual and country characteristics on work-related cultural values. Journal of International Management, 16, 211-233.
  • Kammeyer-Mueller, J., Steel, P., & Rubenstein, A. (2010). The other side of method bias: The perils of distinct source research designs. Multivariate Behavioral Research, 45, 294-321.
  • Bowen, F., Rostami, M., & Steel, P. (2010). Timing is everything: A meta-analysis of the relationships between organizational performance and innovation. Journal of Business Research, 63, 1179–1185.
  • Steel, P., & Kammeyer-Mueller, J. (2009). Using a meta-analytic perspective to enhance Job Component Validation. Personnel Psychology, 62, 533–552.
  • Caird, J., Willness, C. R., Steel, P., & Scialfa, C. (2008). A meta-analysis of the effects of cell phones on driver performance. Accident Analysis & Prevention, 40, 1282-1293.
  • Steel, P., & Kammeyer-Mueller, J. (2008). Bayesian variance estimation for meta-analysis: Quantifying our uncertainty. Organizational Research Methods, 11, 54-78.
  • Steel, P. (2007). The nature of procrastination: A meta-analytic and theoretical review of quintessential self-regulatory failure. Psychological Bulletin, 133, 65-94.
  • Willness, C., Steel, P., & Lee, K. (2007). A meta-analysis of the antecedents and consequences of workplace sexual harassment. Personnel Psychology, 60, 127-162.
  • Steel, P., & Kammeyer-Mueller, J. (2002). Comparing meta-analytic moderator search techniques under realistic conditions. Journal of Applied Psychology, 87, 96-111.

Calling all methodologists

If you are a Canada-affiliated expert interested in bringing your passion and expertise in design and analysis to others through CCRAM, send us your information and we will be in touch to see how we can work together.