Abstract
Purpose
The field of physical therapy education is still seeking an evidence-based approach to admitting qualified applicants, because previous research has assessed a wide variety of outcomes, impeding practical application. This study was conducted to identify preadmission criteria predictive of graduation success.
Methods
Data from the 2013–2016 graduating cohorts (n=149) were collected. Predictors included verbal Graduate Record Examination rank percentile (VGRE%), quantitative GRE rank percentile, analytical GRE rank percentile, the admissions interview, precumulative science grade point average (SGPA), precumulative grade point average (UGPA), and a reflective essay. The National Physical Therapy Examination (NPTE) and grade point average at the time of graduation (GGPA) were used as measures of graduation success. Two separate mixed-effects models determined the associations of preadmission predictors with NPTE performance and GGPA.
Results
The NPTE model fit comparison showed significant results (degrees of freedom [df]=10, P=0.001), decreasing within-cohort variance by 59.5%. NPTE performance was associated with GGPA (β=125.21, P=0.001), and VGRE%, the interview, the essay, and GGPA (P≤0.001) impacted the model fit. The GGPA model fit comparison did not show significant results (df=8, P=0.56), decreasing within-cohort variance by 16.4%. The GGPA was associated with the interview (β=0.02, P=0.04) and UGPA (β=0.25, P=0.04), and VGRE%, the interview, UGPA, and the essay (P≤0.02) impacted model fit.
Conclusion
In our findings, GGPA predicted NPTE performance, and the interview and UGPA predicted GGPA. Unlike past evidence, SGPA showed no predictive power. The essay and VGRE% warrant attention because of their influence on model fit. We recommend that admissions ranking matrices place a greater weight on the interview, UGPA, VGRE%, and the essay.
Multiple preadmission predictors have been studied by physical therapy (PT) and PT assistant education programs. Such predictors have included age [1], gender [1,2], entry-level degree [1,3], Graduate Record Examination (GRE) [2], prerequisite grade point average (GPA) [1], precumulative science grade point average (SGPA) [2,3], precumulative grade point average (UGPA) [1-3], essays [2], and letters of recommendation [2]. These preadmission predictors have been studied with reference to a variety of outcomes, including GPA after the first professional year [1], admission [2], and National PT Examination (NPTE) performance [3].
Past evidence has demonstrated certain associations between preadmission predictors and various outcomes. Jones et al. [4] in 2014 reviewed the admissions processes of PT and physician assistant programs and noted that UGPA was the best predictor of academic success. Ruscingno et al. [1] in 2010 found that UGPA was correlated with the basic sciences GPA after the first professional year. Nuciforo et al. [2] in 2014 found that SGPA was the strongest predictor of PT program admission. In an analysis of PT programs at the University of Tennessee at Chattanooga and the University of North Dakota, no difference was found in the NPTE performance of individuals with differing entry-level degrees, defined as 3 versus 4 years of preprofessional coursework [3]. The field of PT education is still seeking an evidence-based approach to admitting qualified applicants. A considerable amount of research exists; however, previous studies analyzed multiple distinct outcomes, impeding practical application. This comprehensive study adds to the available literature by using multiple preadmission predictors, with NPTE performance and GPA at the time of graduation (GGPA) as outcomes of graduation success.
The aim of this study was to determine which previously identified preadmission criteria were predictors of graduation success from a private PT program in the southwestern United States by using a rigorous statistical analysis representative of the multi-faceted admissions process. We hypothesized that verbal, quantitative, and analytical GRE rank percentile (VGRE%, QGRE%, and AGRE%), the admissions interview, SGPA, UGPA, and the reflective essay would all be predictors of NPTE performance and GGPA.
Methods
Ethical statement
This study was approved by the Institutional Review Board at Midwestern University in Glendale, AZ, USA (#970).
Study design
This was a retrospective cohort study.
Materials and/or subjects
Data from admission to graduation of the graduating cohorts of 2013–2016 (n=149) at a 3-year graduate-level Doctor of PT program were collected. Seventy percent of the students from 2013, 91% from 2014, 87% from 2015, and 84% from 2016 released the data necessary for this study’s statistical analyses.
Technical information
Demographics at admission, preadmission criteria, and interview and essay scores were compiled via records from the Office of Admissions. The GGPA was obtained from the Registrar’s Office and the released NPTE scores were provided by the PT program.
Statistics
Independent variables
Preadmission predictors included VGRE%, QGRE%, AGRE%, the admissions interview, SGPA, UGPA, and the reflective essay. The applicants were scored by 2 raters at the in-person, on-campus admissions interview. Scores were based on a rubric with 7 subscales (4 points per subscale for a total of 28 points), including appearance, body language, communication, experience, knowledge of the PT profession, teamwork/interpersonal skills, and resilience/planning/organization. Reflective essays (composed during the interview) were scored by 1 rater (a PT Admissions Committee member) and scores were based on a rubric with 2 subscales (4 points per subscale, for a total of 8 points) including ethics/integrity and problem solving/critical thinking.
Dependent variables
The NPTE score (out of 800 points) served as the outcome of graduation success for the initial analysis, and the GGPA served as the outcome for the secondary analysis.
With statistical significance set at α<0.05, all statistical analyses were conducted using IBM SPSS ver. 24.0 (IBM Corp., Armonk, NY, USA). Descriptive statistics provided an overview of the mean values of the outcome variables. All models were assessed for normality. A 2-way random intraclass correlation coefficient (ICC) for consistency was selected to measure the interrater reliability of the interview scores. Two separate mixed-effects models, nested by graduating cohort, were used to determine the associations of preadmission predictors with NPTE performance and GGPA between and within graduating cohorts. Correlated predictors were residualized because of concerns regarding multicollinearity and subsequent effects on beta coefficient estimates. In the initial analysis of preadmission predictors and NPTE, step 1 involved configuring the unconditional means model (UMM). The remaining steps involved inputting VGRE% (step 2), QGRE% (step 3), AGRE% (step 4), the interview score (step 5), SGPA (step 6), UGPA (step 7), the residualized product term interaction between UGPA and SGPA (UGPA×SGPA, step 8), the essay score (step 9), GGPA (step 10), and the residualized product term interaction between UGPA and GGPA (UGPA×GGPA, step 11). For the secondary analysis of preadmission predictors and GGPA, step 1 involved configuring the UMM. The remaining steps involved inputting VGRE% (step 2), QGRE% (step 3), AGRE% (step 4), the interview score (step 5), SGPA (step 6), UGPA (step 7), UGPA×SGPA (step 8), and the essay score (step 9).
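For readers who wish to reproduce the general structure of this stepwise build outside of SPSS, the following is a minimal sketch using Python and statsmodels. The data file, column names, and residualized interaction columns (e.g., ugpa_x_sgpa) are hypothetical placeholders, not the study's actual variables or syntax.

```python
# Minimal sketch of the stepwise mixed-effects build (not the authors' SPSS procedure).
# Assumes a hypothetical CSV with one row per graduate and the column names below;
# residualized product terms (ugpa_x_sgpa, ugpa_x_ggpa) are assumed to be precomputed.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("admissions_outcomes.csv")  # hypothetical file name

# Step 1: unconditional means model (UMM), with a random intercept for graduating cohort.
umm = smf.mixedlm("npte ~ 1", data=df, groups="cohort").fit(reml=False)

# Steps 2-11: add one predictor per step, refitting by maximum likelihood so that
# successive nested models can be compared with likelihood-based criteria.
steps = ["vgre_pct", "qgre_pct", "agre_pct", "interview", "sgpa",
         "ugpa", "ugpa_x_sgpa", "essay", "ggpa", "ugpa_x_ggpa"]
models, rhs = [umm], "1"
for term in steps:
    rhs += " + " + term
    fit = smf.mixedlm(f"npte ~ {rhs}", data=df, groups="cohort").fit(reml=False)
    models.append(fit)
```

For the secondary analysis, the same loop would be rerun with ggpa as the outcome and only steps 2-9 (through the essay).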
The analyses adhered to the hierarchical maximum likelihood method [5]. Only fixed effects (i.e., average effects across all cohorts) were explored in the models. When random effects were entered, the model did not converge and individual-level differences were computed as redundant, most likely because of the limited number of cohorts in the model. Only level-one (within-individual) variables, and not level-two (between-cohort) variables, were explored. The variance explained between and within cohorts was quantified by the pseudo R2 [5]. The −2 log-likelihood, Akaike information criterion, and Schwarz's Bayesian information criterion were used for model comparison.
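As an illustration of these quantities, the sketch below (continuing the hypothetical fits from the previous example) computes the intraclass correlation from the UMM, the within-cohort pseudo R2 as the proportional reduction in residual variance, and a likelihood-ratio comparison based on the change in −2 log-likelihood; AIC and BIC can likewise be derived from the log-likelihood and the number of estimated parameters.

```python
# Hedged sketch of the variance-explained and model-comparison computations,
# reusing the hypothetical `umm` and `models` objects from the previous sketch.
from scipy import stats

def icc_from_umm(umm_fit):
    # Proportion of total variance lying between cohorts in the unconditional means model.
    between = umm_fit.cov_re.iloc[0, 0]   # random-intercept (between-cohort) variance
    within = umm_fit.scale                # residual (within-cohort) variance
    return between / (between + within)

def pseudo_r2_within(umm_fit, model_fit):
    # Proportional reduction in within-cohort (residual) variance relative to the UMM.
    return (umm_fit.scale - model_fit.scale) / umm_fit.scale

def lr_test(smaller_fit, larger_fit, df_diff):
    # Change in -2 log-likelihood between nested ML fits, referred to a
    # chi-square distribution with df_diff added parameters.
    chi2 = -2.0 * (smaller_fit.llf - larger_fit.llf)
    return chi2, stats.chi2.sf(chi2, df_diff)

print("ICC (UMM):", icc_from_umm(umm))
print("Within-cohort pseudo R2, steps 1-11:", pseudo_r2_within(umm, models[-1]))
print("LR test, steps 1-11:", lr_test(umm, models[-1], df_diff=10))
```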
Results
The results for the demographics and descriptive statistics of the sample are presented in Table 1. There was a slightly greater representation of males compared to females. The mean age was similar across the graduating cohorts (range, 21 to 36 years). The sample was predominantly Caucasian, with Hispanic students representing the second largest ethnic background. Seventy-three percent of students originated from 24 states other than Arizona. The mean GGPA was similar across graduating cohorts, with individual GGPAs ranging from 2.96 to 3.95. The licensure first-time pass rate and scores on the NPTE demonstrated an upward, stabilizing trend. Overall, there were 9 first-time failures on the NPTE, and scores ranged from 515 to 800 points.
Interrater reliability of the interview scores
Spearman correlation analysis was used to determine the association between the interview scores of rater 1 and rater 2 (r=0.5, P=0.001). The average-measures ICC (0.7, P=0.001) and the single-measures ICC (0.5, P=0.001) were significant for 105 students out of the total sample (interview scores were not available for the entire 2013 graduating cohort, 9 members of the 2014 graduating cohort, and 4 members of the 2015 graduating cohort).
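The consistency ICCs reported above were obtained in SPSS; as a rough illustration of the underlying computation, the sketch below derives single- and average-measures consistency ICCs for two raters from the two-way ANOVA mean squares. The example scores are hypothetical.

```python
# Minimal sketch (not the authors' SPSS procedure) of single- and average-measures
# consistency ICCs computed from two-way ANOVA mean squares.
import numpy as np

def consistency_icc(scores):
    """scores: (n_students x n_raters) array; returns (single, average) consistency ICCs."""
    x = np.asarray(scores, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()    # between-student sum of squares
    ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()    # between-rater sum of squares
    ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols  # residual sum of squares
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    single = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)
    average = (ms_rows - ms_err) / ms_rows
    return single, average

# Hypothetical interview totals (out of 28) from rater 1 and rater 2:
rater1 = np.array([24, 21, 26, 19, 23, 25])
rater2 = np.array([22, 20, 27, 18, 25, 24])
print(consistency_icc(np.column_stack([rater1, rater2])))
```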
Mixed-effects model associated with the National Physical Therapy Examination
The UMM (step 1) from the mixed-effects model for NPTE (Table 2) suggested that 97% of the total variance in NPTE was within-cohort (ICC=0.03). Step 2 revealed no association of the main effect of VGRE% (P=0.75) with NPTE performance. Adding VGRE% in step 2 impacted model fit (P=0.001), but did not help explain the within-cohort (R2=-0.0034) or between-cohort (R2=-0.023) variance. Step 3 revealed no associations of the main effects of VGRE% and QGRE% (P≥0.15) with NPTE performance. Inputting QGRE% in step 3 did not impact model fit (P=0.17) and explained 0.6% of the within-cohort and 45.2% of the between-cohort variance. Step 4 revealed no associations of the main effects of VGRE%, QGRE%, and AGRE% (P≥0.07) with NPTE performance. Adding AGRE% in step 4 did not impact model fit (P=0.08) and explained 1.0% of the within-cohort and 89.2% of the between-cohort variance. Since QGRE% and AGRE% in the initial steps of the analysis helped to explain all of the between-cohort variance (although this variance was small, at 3% of the total), additional intercept values under random effects beyond step 4 were determined to be not applicable.
The remaining predictors in steps 5–11 decreased the overall within-cohort (between-person) variance. Step 5 revealed no associations of the main effects of VGRE%, QGRE%, AGRE%, and the interview (P≥0.14) with NPTE performance. Adding the interview score in step 5 impacted model fit (P=0.001) and explained 20.4% of the within-cohort variance. Step 8 revealed no associations of the main effects of VGRE%, QGRE%, AGRE%, interview score, SGPA, UGPA, and UGPA×SGPA (P≥0.12) with NPTE performance. Adding SGPA, UGPA, and UGPA×SGPA in steps 6–8 did not impact model fit (degrees of freedom [df]=3, P=0.75) and explained an additional 1.2% of the within-cohort variance. Step 9 revealed no associations of the main effects of VGRE%, QGRE%, AGRE%, interview score, SGPA, UGPA, UGPA×SGPA, and the reflective essay (P≥0.07) with NPTE performance. Inputting the essay in step 9 impacted model fit (P=0.001) and explained 12.7% of the within-cohort variance. Lastly, step 11 revealed the only association of this initial model, between GGPA and NPTE performance (P=0.001); the other predictors remained without a significant association. For every 1-point increase in GGPA, the NPTE score increased by 125.2 points. Model fit improved from step 9 to step 11 (df=2, P=0.001), and the added predictors explained 42.0% of the within-cohort variance. The model fit comparison from steps 1–11 showed significant results (df=10, P=0.001), between-cohort variance was fully explained, and within-cohort variance decreased by 59.5%.
Mixed-effects model associated with the grade point average at the time of graduation
The UMM (step 1) from the mixed-effects model for GGPA (Table 3) suggested that 100% of the total variance in GGPA was within-cohort (ICC=0.00). The initial steps of the secondary analysis (through step 4) revealed no associations of the main effects of VGRE%, QGRE%, and AGRE% (P≥0.50) with GGPA. Adding AGRE% in step 4 did not impact model fit (P=0.50) and explained 0.3% of the within-cohort variance. Step 5 revealed an association between the main effect of the interview and GGPA (P=0.04), but no associations of VGRE%, QGRE%, and AGRE% (P≥0.64) with GGPA were found. Inputting the interview in step 5 impacted model fit (P=0.02) and explained 13.4% of the within-cohort variance. Step 6 again revealed an association between the interview and GGPA (P=0.04), but no associations of VGRE%, QGRE%, AGRE%, and SGPA (P≥0.54) with GGPA were found. Adding SGPA in step 6 had no impact on model fit (P=0.54) and explained 0.4% of the within-cohort variance. Step 7 revealed associations between the interview and GGPA (P=0.02) and between UGPA and GGPA (P=0.02), but no associations were found between the other predictors (P≥0.48) and GGPA. Adding UGPA in step 7 impacted model fit (P=0.02) and explained an additional 5.4% of the within-cohort variance. The associations between the interview and GGPA (P=0.02) and between UGPA and GGPA (P=0.04) remained significant in step 8, but no associations were revealed between the other predictors (P≥0.37) and GGPA. There was no impact on model fit from step 7 to step 8 (P=0.51), and an additional 0.4% of within-cohort variance was explained. Lastly, the association between the interview and GGPA (P=0.04) was again significant in step 9, while the other predictors remained without a significant association (P≥0.23). For every 1-point increase in the interview score, the GGPA increased by 0.02 points. Inputting the essay scores in step 9 impacted model fit (P=0.001), but did not help to explain the within-cohort variance (R2=−0.007). The model fit comparison from steps 1–9 did not yield significant results (df=8, P=0.56), and within-cohort variance decreased by 16.4%.
Discussion
Our findings suggest that GGPA predicted NPTE performance, and that the admissions interview and UGPA predicted GGPA. Overall, GGPA, the interview, and the essay best explained within-cohort variance for NPTE performance, and VGRE%, the interview, the essay, and GGPA impacted model fit. Of all the criteria tested in the initial analysis, only GGPA was associated with NPTE performance. This lone finding provided a rationale for conducting a secondary analysis with GGPA as the outcome. For GGPA, the interview and UGPA best explained within-cohort variance, and VGRE%, the interview, the essay, and UGPA impacted model fit. Of all the criteria tested in the secondary analysis, the interview and UGPA were associated with GGPA. Unlike past evidence [2,3], this study did not find any significant associations between SGPA and the outcomes of interest. Since the interview helped to explain the within-cohort variance and impacted model fit for both NPTE performance and GGPA, it had a notable impact across both analyses. While time, expense, and implicit biases [6] serve as deterrents to hosting admissions interviews, the results of this research suggest that the 41% of PT programs currently not requiring interviews [7] should potentially reconsider that practice.
This study suggests that VGRE% and the reflective essay should be considered as predictors of graduation success. Although Moneta-Koehler et al. [8] in 2017 revealed that the GRE did not predict success in biomedical graduate school, the VGRE% in this study impacted model fit for both NPTE performance and GGPA. This finding for VGRE% indicates that it remains a valid predictor. The reflective essay also warrants attention because of its impact on model fit for both NPTE performance and GGPA. This provides validity for continued use of the essay as an assessment of the reflective skills necessary for graduation success.
Although between-cohort variance for NPTE performance was fully explained by QGRE% and AGRE%, these differences were minimal. No between-cohort variance was expressed for the GGPA model. The minimal or absent between-cohort variance in the NPTE and GGPA models, respectively, confirmed the homogeneity across graduating cohorts from this PT program. Within-cohort variance explained by QGRE% and AGRE% was also minimal, indicating the lack of significance of these GRE subscales in predicting NPTE performance.
While mixed-effects modeling provides a rigorous statistical analysis representative of the multi-faceted admissions process, 37.4% of the variance in NPTE performance and 83.6% of the variance in GGPA remained unexplained by the multiple predictors analyzed. Other, yet-to-be-identified predictors of graduation success must explain the remaining variance in NPTE performance and GGPA. The authors support the movement toward more holistic admissions [9-11] and believe that the variables considered as part of that process will further explain what predicts graduation success in the field of PT education. Further analysis is needed of the objective measures used to quantify an applicant's interpersonal skills, resilience, tenacity, and grit. The generalizability of these findings is limited since data were collected from only 1 program; however, 25 different home states were represented, providing a fair approximation of a nationally based sample. Future collaboration with local and national PT programs will further enhance the generalizability of our findings and help inform the admissions practices of PT programs on a broader scale.
Meanwhile, the outcomes of these analyses have informed adjustments made to the weighting of each preadmission predictor within the ranking matrix used by our PT program. Out of a total of 100%, the average interview score remains a strong determinant at 25%, and the reflective essay continues to account for 10% of an applicant’s rank. Previously weighted at 15% and 25%, respectively, UGPA and SGPA now determine 25% and 15% of an applicant’s rank. The VGRE%, QGRE%, and AGRE%, which were previously weighted at 10%, 5%, and 10%, now determine 15%, 5%, and 5% of an applicant’s rank, respectively. We recommend that admissions ranking matrices designate greater weight to the admissions interview, UGPA, VGRE%, and the essay.
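As a concrete illustration of how such a ranking matrix operates, the sketch below combines normalized component scores using the revised weights described above; the applicant values, normalization choices, and function names are hypothetical and do not represent the program's actual scoring tool.

```python
# Hedged illustration of a weighted admissions ranking matrix using the revised weights
# described in the text. Component scores are assumed to be pre-normalized to a 0-1 scale.
REVISED_WEIGHTS = {
    "interview": 0.25, "essay": 0.10, "ugpa": 0.25, "sgpa": 0.15,
    "vgre_pct": 0.15, "qgre_pct": 0.05, "agre_pct": 0.05,
}  # sums to 1.00 (i.e., 100% of an applicant's rank)

def composite_score(normalized_scores):
    """Weighted sum of normalized (0-1) component scores for one applicant."""
    return sum(REVISED_WEIGHTS[k] * normalized_scores[k] for k in REVISED_WEIGHTS)

# Hypothetical applicant, each component rescaled to 0-1:
applicant = {"interview": 24 / 28, "essay": 7 / 8, "ugpa": 3.6 / 4.0,
             "sgpa": 3.4 / 4.0, "vgre_pct": 0.82, "qgre_pct": 0.61, "agre_pct": 0.70}
print(round(composite_score(applicant), 3))
```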
Notes
Authors’ contributions
Conceptualization: GR. Data curation: GR. Formal analysis: GR, MB. Methodology: GR, MB. Project administration: GR. Visualization: GR. Writing–original draft: GR. Writing–review and editing: GR, MB.
Conflict of interest
No potential conflict of interest relevant to this article was reported.
Funding
This work was supported by the American Physical Therapy Association, Academy of Physical Therapy Education Adopt-a-Doc Scholarship (Roman). This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.
Acknowledgments
The authors would like to thank Tracey Ramirez, Assistant Director in the Office of Admissions; Byron Russell, Director of the PT Program; and Virginia Bell, Associate Registrar of the Registrar's Office at Midwestern University, for their assistance with acquiring the data necessary to complete this work.
1. Ruscingno G, Zipp GP, Olson V. Admission variables and academic success in the first year of the professional phase in a doctor of physical therapy program. J Allied Health. 2010; 39:138–142.
2. Nuciforo M, Litvinsky Y, Rheault W. Variables predictive of admission to US physical therapist education programs. J Phys Ther Educ. 2014; 28:112–119. https://doi.org/10.1097/00001416-201407000-00012.
3. Fell N, Mabey R, Mohr T, Ingram D. The preprofessional degree: is it a predictor of success in physical therapy education programs? J Phys Ther Educ. 2015; 29:13–21. https://doi.org/10.1097/00001416-201529030-00004.
4. Jones PE, Simpkins S, Hocking JA. Imperfect physician assistant and physical therapist admissions processes in the United States. J Educ Eval Health Prof. 2014; 11:11. https://doi.org/10.3352/jeehp.2014.11.11.
5. Hekler EB, Buman MP, Ahn D, Dunton G, Atienza AA, King AC. Are daily fluctuations in perceived environment associated with walking? Psychol Health. 2012; 27:1009–1020. https://doi.org/10.1080/08870446.2011.645213.
6. Capers Q 4th, Clinchot D, McDougle L, Greenwald AG. Implicit racial bias in medical school admissions. Acad Med. 2017; 92:365–369. https://doi.org/10.1097/ACM.0000000000001388.
7. Physical Therapy Central Application Service. PTCAS application instructions [Internet]. Alexandria (VA): Physical Therapy Central Application Service; 2018 [cited 2017 Apr 5]. Available from: http://www.ptcas.org.
8. Moneta-Koehler L, Brown AM, Petrie KA, Evans BJ, Chalkley R. The limitations of the GRE in predicting success in biomedical graduate school. PLoS One. 2017; 12:e0166742. https://doi.org/10.1371/journal.pone.0166742.
Table 3. Mixed-effects model associated with the cumulative grade point average at the time of graduation.