Abstract
Purpose
The primary aim of this study was to develop a survey addressing an individual’s non-cognitive traits, such as emotional intelligence, interpersonal skills, social intelligence, psychological flexibility, and grit. Such a tool would inform the continued development of admissions standards and would help capture the full breadth of experience and capabilities of applicants to doctor of physical therapy (DPT) programs.
Methods
This was a cross-sectional survey study involving learners in DPT programs at 3 academic institutions in the United States. A survey was developed based on established non-proprietary, non-cognitive measures affiliated with success and resilience. The survey was assessed for face validity, and exploratory factor analysis (EFA) was used to identify subgroups of factors based on responses to the items.
Results
A total of 298 participants (90.3%) completed all elements of the survey. EFA yielded 39 items for dimensional assessment after excluding items with regression coefficients < 0.4. Within these 39 items, 3 latent constructs were identified: adaptability (16 items), intuitiveness (12 items), and engagement (11 items).
Although there are many metrics for determining an applicant’s cognitive traits, few metrics are known to assess a prospective student’s non-cognitive abilities [1]. Non-cognitive traits encompass a prospective student’s behaviors, motivations, and personality [2]. Emotional intelligence, which encompasses managing one’s own emotions and interpersonal relationships, has been correlated with improved grade point average in nursing programs [3]. In medical students, academic success has also been correlated with emotional intelligence [4]. Improved patient outcomes and adherence to treatment plans have been shown to occur when healthcare practitioners possess desirable non-cognitive traits, such as integrity, motivation, strong interpersonal and intrapersonal communication skills, and the ability to collaborate with colleagues [5]. In the absence of a non-cognitive metric, admissions committees struggle to identify students who will thrive in their educational setting and ultimately develop into excellent clinicians [6].
The purpose of this study was to develop a preliminary tool to identify the non-cognitive traits of prospective learners. Using preexisting validated items from non-cognitive questionnaires, we implemented qualitative and quantitative processes to identify items that were succinct and discriminative of non-cognitive traits. The overarching goal of this study was to initiate the development of a non-cognitive metric that physical therapy admissions committees can use to evaluate an individual’s full breadth of potential. Subsequently, non-cognitive traits can be assessed in future studies to identify associations with academic and clinical excellence and leadership.
The study was approved by the institutional ethics boards of Duke University (#83516), the University of Colorado (#17-1228), and the University of Indianapolis (#0828), and informed consent was obtained from the subjects.
This was a cross-sectional survey study involving doctor of physical therapy (DPT) programs at 3 academic institutions.
Two researchers (CS and CC) performed a literature search to identify key non-proprietary, non-cognitive measures associated with success and resilience. The intent was to include a wide range of standardized non-cognitive measurement sources that reflected emotional intelligence, interpersonal skills, social intelligence, resilience, psychological flexibility, and/or grit. Questionnaires that were well represented in the literature and targeted specific constructs were sought out. The following 6 questionnaires were identified: (1) the Schutte Self Report Emotional Intelligence Test [7], (2) the Interpersonal Reactivity Index (IRI) [8], (3) the Intolerance of Uncertainty Scale (IUS) [9], (4) Measuring Social Intelligence (MSI) [10], (5) the Psychological Flexibility Questionnaire [11], and (6) the Short Grit Scale (Grit-S) [12].
The Schutte Self Report Emotional Intelligence Test is a 33-item measure of emotional intelligence based on Salovey and Mayer’s model of emotional intelligence [13] and refined by Schutte et al. [7] in 1998. The test requires respondents to determine how much each statement describes them, using a 5-point Likert scale ranging from 5 (strongly agree) to 1 (strongly disagree). Higher response scores correspond to higher emotional intelligence, and individuals with higher emotional intelligence have shown better abilities to manage stress in both academic and clinical settings [14]. The test demonstrated a high internal consistency among college students (Cronbach alpha= 0.87) [7]. Validation studies have demonstrated correlations with the theoretical constructs of alexithymia, attention to feelings, clarity of feelings, mood repair, optimism, and impulse control. The scale has also been demonstrated to predict the grades of first-year collegiate students, despite not assessing cognitive ability [7].
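Scoring for Likert-based instruments such as this one typically amounts to summing the item responses after reverse-scoring any negatively worded items. The following minimal Python sketch illustrates that general pattern; the function and the reverse-scored item positions are hypothetical and do not reproduce the actual scoring key of the Schutte test or of any other instrument used in this study.

def score_likert(responses, reverse_items=(), scale_min=1, scale_max=5):
    # responses: dict mapping item number -> raw response on the 1-5 scale
    total = 0
    for item, value in responses.items():
        if item in reverse_items:
            # Reverse-score negatively worded items (e.g., a 2 becomes a 4 on a 1-5 scale)
            value = scale_min + scale_max - value
        total += value
    return total

# Toy example with 4 items, of which item 3 is reverse-scored: 4 + 5 + 4 + 3 = 16
print(score_likert({1: 4, 2: 5, 3: 2, 4: 3}, reverse_items={3}))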
The IRI is a 28-item measure of empathy. Empathy is a foundational trait for healthcare practitioners to develop meaningful relationships with their patients, and it should be assessed and cultivated in healthcare professionals [15,16]. The IRI uses a 5-point scale ranging from 4 (describes me very well) to 0 (does not describe me well). There are 4 subscales within the IRI that address both cognitive and non-cognitive components of empathy. The 4 scales include: (1) perspective taking, (2) empathic concern, (3) fantasy, and (4) personal distress. The IRI has demonstrated moderate levels of internal consistency for females (Cronbach alpha= 0.70–0.78) and males (Cronbach alpha= 0.75–0.78) [16].
The IUS is a 27-item scale that revolves around the idea that “uncertainty is unacceptable, reflects badly on a person, and leads to frustration, stress, and the inability to take action” [9]. The IUS items are scored on a 5-point Likert scale ranging from 5 (entirely characteristic of me) to 1 (not at all characteristic of me). Both the initial French version by Freeston and colleagues in 1994 and the English translation by Buhr and Dugas [9] in 2002 demonstrated excellent internal consistency (Cronbach alpha= 0.94) and good test-retest reliability (r= 0.74). Tolerance of uncertainty is essential in the healthcare professions and has been linked to leadership potential in physicians [17], how clinicians make decisions with their patients, and how providers exchange information to build therapeutic relationships [18]. Conversely, higher intolerance of uncertainty in physicians has been associated with increased charges for services [19], and medical students with a higher intolerance of uncertainty have been found to avoid working with underserved populations [20].
The MSI scale contains 21 items, and respondents rate descriptions of social scenarios on a 5-point scale ranging from 4 (very often) to 0 (never). Three constructs are included in the MSI scale: (1) manipulation, (2) empathy, and (3) social irritability. Social intelligence and emotional intelligence are thought to be closely related [21]. Social intelligence includes how one perceives and expresses emotion in new situations, both interpersonally and intrapersonally [10]. Social intelligence is also linked with the development of goals and motivation [10]. Interdisciplinary teamwork is essential for competent patient care [22]. Effective teams contain individuals who are respectful, listen actively, and seek the knowledge of teammates [23]. These behaviors and attitudes that drive successful teams can be improved during education and are a significant part of healthcare educational curricula [22,23].
The Psychological Flexibility Questionnaire is a 20-item scale that measures 5 factors: (1) positive perception of change, (2) characterization of self as flexible, (3) self-characterization as open and innovative, (4) a perception of reality as dynamic and changing, and (5) a perception of reality as multifaceted [11]. The questionnaire uses a 6-point Likert scale ranging from 6 (very much) to 1 (not at all) to determine the degree to which a respondent is characterized by each statement. The questionnaire has demonstrated high reliability (Cronbach alpha= 0.918). Individuals with high psychological flexibility have demonstrated a greater ability to work towards goals in stressful environments and a lower risk of burnout [24].
The Grit-S is an 8-item scale designed to measure “perseverance and passion toward long-term goals” [25]. The Grit-S was based on original work by Duckworth et al. [25] in 2007 and then refined to its current form in 2009 by Duckworth and Quinn [12]. Items are scored on a 5-point scale ranging from 5 (very much like me) to 1 (not at all like me). Grit and resilience have received prominent attention in health education research [26]. Grit has been associated with success in both academic and professional environments [25]. The scale has also demonstrated high internal consistency and predictive validity [12,25].
The 6 identified questionnaires were obtained, and each item was transcribed into a single document and de-identified. The face validity of the survey was assessed by 2 researchers (CS and CC) through blinded-rater agreement, and individual scale items were excluded based on: (1) the ease of determining the desired response to the item, (2) the grammar and ease of understanding of the item, and (3) the potential for ceiling and floor effects. In short, the researchers were concerned that the directionality of many of the items would allow individuals to “game” the questions and answer in the way they thought they should, rather than based on how they felt. The agreement for the initial removal process was 75.0%. After consensus, 75 items were removed from the original 143, leaving 68 items for dimensional analysis.
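Blinded-rater agreement of this kind is simply the percentage of items on which the 2 raters made the same keep/remove decision. A minimal Python sketch follows, using hypothetical ratings rather than the study’s actual data.

def percent_agreement(rater_a, rater_b):
    # rater_a, rater_b: equal-length lists of 'keep'/'remove' decisions, one per item
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100.0 * matches / len(rater_a)

# Hypothetical example: the raters agree on 3 of 4 items, giving 75.0% agreement
rater_a = ["keep", "remove", "remove", "keep"]
rater_b = ["keep", "remove", "keep", "keep"]
print(f"{percent_agreement(rater_a, rater_b):.1f}%")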
The survey was sent to students in the first- and second-year DPT programs at 3 institutions: Duke University Doctor of Physical Therapy Program, University of Colorado Doctor of Physical Therapy Program, and University of Indianapolis Krannert School of Physical Therapy. Duke University is a private university in Durham, North Carolina with a Carnegie classification of a doctoral university: highest research activity. The University of Colorado is a public university in Aurora, Colorado with a Carnegie classification of a doctoral university: highest research activity. The University of Indianapolis is a private university in Indianapolis, Indiana with a Carnegie classification of master’s colleges and universities: larger program.
The partnership of the institutions in this study revolved primarily around an interest in improving admissions processes. Ideally, researchers wanted to capture survey results from a wide range of students in diverse geographic locations and at both public and private institutions. Descriptive statistics for the cohorts were shared, including age at time of application, year in the DPT program, and sex (Table 1). These were the only descriptive statistics shared among institutions. Each institution maintained additional identifiable information for their respective cohorts.
All surveys were administered through Qualtrics software (Qualtrics, Provo, UT, USA). Each institution used the Qualtrics email function to maintain identified data for the survey respondents.
The survey was first presented to the Duke University cohorts. Researchers oriented the first- and second-year students to the purpose of the survey, and it was stressed that the survey would have no repercussions on academic standing or other adverse effects. The survey was formulated and sent to the first- and second-year classes at Duke University on March 21, 2017. A follow-up reminder email was sent on March 27, 2017 to students who had not completed the survey. The survey closed on April 1, 2017, with 144 of a possible 154 respondents, for a response rate of 93.5%.
The University of Colorado and University of Indianapolis followed a similar procedure of survey administration as Duke University. The University of Colorado cohorts received an initial letter discussing the non-cognitive survey on May 8, 2017, and the survey opened on May 9, 2017. Students received a reminder email on May 15, 2017, and the survey closed on May 27, 2017. The University of Colorado cohorts had 115 of a possible 136 respondents, for a response rate of 84.6%. Similarly, the University of Indianapolis cohorts received the initial survey on June 8, 2017, with reminders for completion on June 15 and 26, 2017. The University of Indianapolis cohorts had 71 of a possible 83 respondents, for a response rate of 85.5%.
Qualtrics provided raw identified data in Excel format for each institution. De-identified data were shared at a central location (Duke University) and the data were compiled into a common IBM SPSS ver. 24.0 file (IBM Corp., Armonk, NY, USA).
In total, 330 surveys were initiated, of which 298 were completed. Item-level responses were present for 98.6% of the data, and 90.3% of surveys contained complete data. Since there were few missing values, SPSS was set to skip the missing values in the analysis.
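The compilation and completeness checks described above could be approximated outside of SPSS as follows. This is a minimal Python (pandas) sketch under the assumptions that each institution exported one de-identified Excel file and that the file names shown are hypothetical; missing item responses are left as NaN so that later steps can skip them, analogous to the handling in SPSS.

import pandas as pd

# Hypothetical de-identified exports, one per institution
files = ["duke.xlsx", "colorado.xlsx", "indianapolis.xlsx"]
survey = pd.concat([pd.read_excel(f) for f in files], ignore_index=True)

# Share of surveys with complete data; missing responses remain NaN and are skipped later
complete = survey.dropna()
print(f"{len(complete)} of {len(survey)} surveys complete "
      f"({100 * len(complete) / len(survey):.1f}%)")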
SPSS software (IBM Corp., Armonk, NY, USA) was used for all analyses, with the alpha level set at 0.05 for statistical significance. Exploratory factor analysis (EFA) was used to identify subgroups of factors based on responses to the non-cognitive items. For EFA to be appropriate, 2 conditions are necessary: (1) an adequate sample size and (2) appropriateness of the data for factor analysis. Although no consensus exists on the appropriate sample size for factor analysis, our sample of 298 was deemed to be near the threshold of a ‘good’ power estimate [27].
Sampling adequacy was assessed via the anti-image correlation matrix, with items whose values were less than 0.5 excluded from the analysis. Sampling adequacy was also assessed through the Kaiser-Meyer-Olkin (KMO) statistic, which reflects the proportion of variance among the variables that may be attributable to underlying factors. A KMO statistic above 0.50 was deemed acceptable for this analysis [28]. The Bartlett test of sphericity was used to confirm that the correlation matrix was suitable for structure detection.
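Although the study’s analysis was run in SPSS, the same adequacy checks can be reproduced with the open-source factor_analyzer package in Python. A minimal sketch, assuming the item responses are held in a pandas DataFrame named items with one column per survey item:

import pandas as pd
from factor_analyzer.factor_analyzer import calculate_kmo, calculate_bartlett_sphericity

def check_sampling_adequacy(items: pd.DataFrame):
    # Per-item and overall KMO statistics; items below 0.5 would be candidates for exclusion
    kmo_per_item, kmo_overall = calculate_kmo(items)
    # Bartlett test of sphericity: is the correlation matrix suitable for structure detection?
    chi_square, p_value = calculate_bartlett_sphericity(items)
    return kmo_per_item, kmo_overall, chi_square, p_value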
Factor selection was determined based on an eigenvalue greater than 1 occurring prior to the inflection point on a scree plot [29], which has been deemed an appropriate method of factor selection for sample sizes greater than 200 [30]. Once the factors were extracted, and after several iterations involving different rotation methods were run, factor detection was enhanced using orthogonal (varimax) rotation [31]. Varimax rotation was selected because, qualitatively, its results provided the most logical grouping of items. A variable was considered a defining part of a factor if its factor regression coefficient was greater than or equal to 0.4 [28]. Once common factors were identified, the research team labeled each factor based on the latent construct shared by its items.
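A minimal Python sketch of this extraction step using the factor_analyzer package is given below; the published analysis was performed in SPSS, so this is an illustration of the procedure under stated assumptions rather than the study’s code, and the DataFrame items is assumed as above.

import pandas as pd
from factor_analyzer import FactorAnalyzer

def run_efa(items: pd.DataFrame, loading_cutoff: float = 0.4):
    # Unrotated pass to obtain eigenvalues for the scree-plot/Kaiser decision
    fa = FactorAnalyzer(rotation=None)
    fa.fit(items)
    eigenvalues, _ = fa.get_eigenvalues()
    n_factors = int((eigenvalues > 1).sum())  # eigenvalue-greater-than-1 rule

    # Re-fit with the chosen number of factors and an orthogonal (varimax) rotation
    fa = FactorAnalyzer(n_factors=n_factors, rotation="varimax")
    fa.fit(items)
    loadings = pd.DataFrame(fa.loadings_, index=items.columns)

    # Items that fail to load at >= 0.4 on any factor are dropped from the instrument
    retained = loadings[loadings.abs().max(axis=1) >= loading_cutoff]
    return eigenvalues, retained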
Surveys were initiated by 88.5% of possible respondents, for a total of 330 students, of whom 144 (43.6%) were from Duke University, 115 (34.8%) were from the University of Colorado, and 71 (21.5%) were from the University of Indianapolis. One hundred seventy-three respondents (52.4%) were first-year students and 157 (47.6%) were second-year students. Of the total respondents, 298 (90.3%) completed all elements of the survey. Raw data are available from Supplement 1.
Anti-image correlation resulted in 0 items being removed, and the KMO statistic of 0.804 confirmed sampling adequacy. The Bartlett test (χ²= 8,196.245, P≤ 0.01) confirmed the suitability of the data for structure detection. During the EFA phase, a total of 29 items were removed because each had a regression coefficient < 0.4, leaving 39 items for dimensional assessment.
The remaining 39 items loaded on 3 axes (Table 2). These 3 axes included the latent constructs of adaptability (16 items), intuitiveness (12 items), and engagement (11 items). The adaptability construct included items from the IUS and the Psychological Flexibility Questionnaire. The intuitiveness construct included items from the Schutte Self Report Emotional Intelligence Test, the MSI scale, and the Grit-S. The engagement construct included items from the IRI and the Psychological Flexibility Questionnaire.
Among first- and second-year DPT students in 3 different programs, we administered items representing multiple non-cognitive constructs (e.g., emotional intelligence, resilience). In doing so, we identified the following 3 domains: adaptability, intuitiveness, and engagement. The survey questions in the adaptability domain address how well an individual deals with uncertainty. Example statements in this domain include, “When I am uncertain I can’t function very well” and “I am open to experiencing the different and the exceptional.” An individual’s empathy, emotional intelligence, and ability to be open-minded can be explored within the intuitiveness domain. Example statements in this domain include, “I am aware of my emotions as I experience them” and “I am able to recognize the wishes of others.” Finally, the engagement domain encompasses one’s ability to show commitment to interpersonal interactions. Example statements from this domain include, “I believe that there are two sides to every question and try to look at them” and “There are usually many possible ways to do things.” These 3 domains represent desirable traits of learners and healthcare professionals and correspond well to pre-existing personality constructs associated with academic success, such as openness, agreeableness, extraversion, and emotional intelligence [32]. Some of the items removed during EFA involved subscales analyzing happy emotions, social irritability, and use of manipulation in social settings. These subscales could be considered to be at the extremes of non-cognitive traits, which might have been why they did not fit into the dimensional assessment. The 3 domains identified in this preliminary instrument may be ideal for consideration in the admissions process, as well as in future studies of academic and professional success.
DPT programs are training the professional physical therapist workforce. Non-cognitive assessments have shown great value in workforce development, and it has been argued that non-cognitive skills are more valuable to employers than cognitive abilities in predicting job performance [33]. Professionalism is essential for having successful, positive interactions in clinical settings, and professionalism training in DPT curricula requires the investment of significant cost and effort [34,35]. Admissions committees should target individuals who have the ability to flourish in clinical practice and be successful as members of the physical therapy workforce. Accepted students will hopefully be individuals who work well on teams, develop strong therapeutic alliances with their patients, and seek leadership positions, but also have academic success, meet their goals, and enjoy their careers long-term.
In an attempt to better identify the desired traits of prospective students, many health professions have initiated a holistic review process. Holistic admissions review is an individualized process that aims to capture an applicant’s full breadth of potential. Such reviews continue to weigh a prospective student’s cognitive metrics, but have also increasingly focused on applicants’ experiences and attributes to build a more diverse and collaborative workforce. Holistic review processes have been implemented across the health professions of medicine, dentistry, nursing, and physical therapy [36,37,38]. It has been argued that personality traits should be used to predict academic success in higher education [39]. However, the desired traits can be difficult to identify due to a lack of standardized metrics.
The 39 items included in our preliminary survey had strong construct validity, dimensionality, and content validity. Our initial removal phase was designed to improve face validity. However, for this tool to have value, our next steps will require assessment of concurrent validity with academic performance, National Physical Therapy Exam (NPTE) pass rates, and institution-specific alumni milestones. This instrument will require continued evolution as DPT programs examine survey results. The researchers at each institution in this study plan to compare respondents’ completed survey findings with their future academic success in both the didactic and clinical portions of the curriculum. The survey results will also be compared with the pass rate on the NPTE and program-specific alumni milestones. The potential use of this tool for admissions decisions has yet to be established. Furthermore, our sample included current DPT students who had already matriculated into a program. Niessen et al. [40] suggested that individuals may respond differently to questionnaires when high-stakes scenarios such as graduate school admissions are involved. There are also inherent difficulties with self-reported measures, as individuals may not answer surveys honestly as they try to determine the desired responses. Finally, if such a survey were to be used in admissions decisions, prospective students might seek coaching to become more attractive candidates.
Each institution has unique priorities for the desired traits of prospective students, and the proposed non-cognitive evaluation tool will hopefully enable programs to improve their ability to identify prospective learners who correspond with their desired traits. This survey could play an important role in addressing non-cognitive traits of prospective learners in a holistic admissions review process. Ideally, traits could be identified that would correlate with academic and clinical success to better identify individuals during the admissions process who reflect the unique characteristics of each program.
References
1. Cook C. 20th Pauline Cerasoli lecture: the Sunk Cost Fallacy. J Phys Ther Educ. 2017; 31:10–14. https://doi.org/10.1097/00001416-201731030-00005.
2. Megginson L. Noncognitive constructs in graduate admissions: an integrative review of available instruments. Nurse Educ. 2009; 34:254–261. https://doi.org/10.1097/NNE.0b013e3181bc7465.
3. Sharon D, Grinberg K. Does the level of emotional intelligence affect the degree of success in nursing studies? Nurse Educ Today. 2018; 64:21–26. https://doi.org/10.1016/j.nedt.2018.01.030.
4. Cook CJ, Cook CE, Hilton TN. Does emotional intelligence influence success during medical school admissions and program matriculation?: a systematic review. J Educ Eval Health Prof. 2016; 13:40. https://doi.org/10.3352/jeehp.2016.13.40.
5. Koenig TW, Parrish SK, Terregino CA, Williams JP, Dunleavy DM, Volsch JM. Core personal competencies important to entering students’ success in medical school: what are they and how could they be assessed early in the admission process? Acad Med. 2013; 88:603–613. https://doi.org/10.1097/ACM.0b013e31828b3389.
6. Guffey JS, Farris JW, Aldridge R, Thomas T. An evaluation of the usefulness of noncognitive variables as predictors of scores on the national physical therapy licensing examination. J Allied Health. 2002; 31:78–86.
7. Schutte NS, Malouff JM, Hall LE, Haggerty DJ, Cooper JT, Golden CJ, Dornheim L. Development and validation of a measure of emotional intelligence. Pers Individ Dif. 1998; 25:167–177. https://doi.org/10.1016/S0191-8869(98)00001-4.
8. Davis MH. A multidimensional approach to individual differences in empathy. JSAS Cat Sel Doc Psychol. 1980; 10:85.
9. Buhr K, Dugas MJ. The Intolerance of Uncertainty Scale: psychometric properties of the English version. Behav Res Ther. 2002; 40:931–945. https://doi.org/10.1016/S0005-7967(01)00092-4.
10. Frankovsky M, Birknerova Z. Measuring social intelligence: the MESI methodology. Asian Soc Sci. 2014; 10:90–97. https://doi.org/10.5539/ass.v10n6p90.
11. Ben-Itzhak S, Bluvstein I, Maor M. The psychological flexibility questionnaire (PFQ): development, reliability and validity. Webmed Cent Psychol. 2014; 5:WMC004606. https://doi.org/10.9754/journal.wmc.2014.004606.
12. Duckworth AL, Quinn PD. Development and validation of the short grit scale (grit-s). J Pers Assess. 2009; 91:166–174. https://doi.org/10.1080/00223890802634290.
13. Salovey P, Mayer JD. Emotional intelligence. Imagin Cogn Pers. 1990; 9:185–211. https://doi.org/10.2190/dugg-p24e-52wk-6cdg.
14. Enns A, Eldridge GD, Montgomery C, Gonzalez VM. Perceived stress, coping strategies, and emotional intelligence: a cross-sectional study of university students in helping disciplines. Nurse Educ Today. 2018; 68:226–231. https://doi.org/10.1016/j.nedt.2018.06.012.
15. Hojat M, Mangione S, Kane GC, Gonnella JS. Relationships between scores of the Jefferson Scale of Physician Empathy (JSPE) and the Interpersonal Reactivity Index (IRI). Med Teach. 2005; 27:625–628. https://doi.org/10.1080/01421590500069744.
16. Yu J, Kirk M. Evaluation of empathy measurement tools in nursing: systematic review. J Adv Nurs. 2009; 65:1790–1806. https://doi.org/10.1111/j.1365-2648.2009.05071.x.
17. Sherrill WW. Tolerance of ambiguity among MD/MBA students: implications for management potential. J Contin Educ Health Prof. 2001; 21:117–122. https://doi.org/10.1002/chp.1340210209.
18. Hillen MA, Gutheil CM, Strout TD, Smets EM, Han PK. Tolerance of uncertainty: conceptual analysis, integrative model, and implications for healthcare. Soc Sci Med. 2017; 180:62–75. https://doi.org/10.1016/j.socscimed.2017.03.024.
19. Allison JJ, Kiefe CI, Cook EF, Gerrity MS, Orav EJ, Centor R. The association of physician attitudes about uncertainty and risk taking with resource use in a Medicare HMO. Med Decis Making. 1998; 18:320–329. https://doi.org/10.1177/0272989x9801800310.
20. Wayne S, Dellmore D, Serna L, Jerabek R, Timm C, Kalishman S. The association between intolerance of ambiguity and decline in medical students’ attitudes toward the underserved. Acad Med. 2011; 86:877–882. https://doi.org/10.1097/ACM.0b013e31821dac01.
21. Bar-On R. The Bar-On model of emotional-social intelligence (ESI). Psicothema. 2006; 18 Suppl:13–25.
22. Marlow SL, Hughes AM, Sonesh SC, Gregory ME, Lacerenza CN, Benishek LE, Woods AL, Hernandez C, Salas E. A systematic review of team training in health care: ten questions. Jt Comm J Qual Patient Saf. 2017; 43:197–204. https://doi.org/10.1016/j.jcjq.2016.12.004.
23. Salas E, Zajac S, Marlow SL. Transforming health care one team at a time: ten observations and the trail ahead. Group Organ Manag. 2018; 43:357–381. https://doi.org/10.1177/1059601118756554.
24. Lloyd J, Bond FW, Flaxman PE. The value of psychological flexibility: examining psychological mechanisms underpinning a cognitive behavioural therapy intervention for burnout. Work Stress. 2013; 27:181–199. https://doi.org/10.1080/02678373.2013.782157.
25. Duckworth AL, Peterson C, Matthews MD, Kelly DR. Grit: perseverance and passion for long-term goals. J Pers Soc Psychol. 2007; 92:1087–1101. https://doi.org/10.1037/0022-3514.92.6.1087.
26. Stoffel JM, Cain J. Review of grit and resilience literature within health professions education. Am J Pharm Educ. 2018; 82:6150. https://doi.org/10.5688/ajpe6150.
27. MacCallum RC, Widaman KF, Zhang S, Hong S. Sample size in factor analysis. Psychol Methods. 1999; 4:84–99. https://doi.org/10.1037/1082-989X.4.1.84.
28. Field A. Discovering statistics: using SPSS for Windows. London: Sage Publications;2001.
29. Cattell RB. The scree test for the number of factors. Multivariate Behav Res. 1966; 1:245–276. https://doi.org/10.1207/s15327906mbr0102_10.
30. Stevens J. Applied multivariate statistics for the social sciences. Mahwah (NJ): Lawrence Erlbaum Associates;2002.
31. Garson GD. Factor analysis. Asheboro (NC): Statistical Associates Publishers;2013.
32. Richardson M, Abraham C, Bond R. Psychological correlates of university students’ academic performance: a systematic review and meta-analysis. Psychol Bull. 2012; 138:353–387. https://doi.org/10.1037/a0026838.
33. Connolly C, Gubbins C, Murphy E. Non-cognitive influences on trainee learning within the manufacturing industry. In: Proceedings of the 2010 IEEE Transforming Engineering Education: Creating Interdisciplinary Skills for Complex Global Environments; 2010 Apr 6-9; Dublin, Ireland. Piscataway (NJ): Institute of Electrical and Electronics Engineers; 2010. p. 1–6. https://doi.org/10.1109/TEE.2010.5508954.
34. Osadnik CR, Paynter SL, Maloney SR. Re: admission interview scores are associated with clinical performance in an undergraduate physiotherapy course: an observational study. Physiotherapy. 2016; 102:119–120. https://doi.org/10.1016/j.physio.2015.02.001.
35. Hayward LM, Blackmer B. A model for teaching and assessing core values development in doctor of physical therapy students. J Phys Ther Educ. 2010; 24:16–26. https://doi.org/10.1097/00001416-201007000-00003.
36. Wise D, Dominguez J, Kapasi Z, Williams-York B, Moerchen V, Brooks S, Ross LJ. Defining underrepresented minorities and promoting holistic review admission strategies in physical therapist education. J Phys Ther Educ. 2017; 31:8–13. https://doi.org/10.1097/jte.0000000000000009.
37. Glazer G, Clark A, Bankston K, Danek J, Fair M, Michaels J. Holistic admissions in nursing: we can do this. J Prof Nurs. 2016; 32:306–313. https://doi.org/10.1016/j.profnurs.2016.01.001.
38. Urban Universities for Health. Holistic admissions in the health professions: findings from a national survey [Internet]. Washington (DC): Urban Universities for Health;2014. [cited 2018 Feb 02]. Available from: http://urbanuniversitiesforhealth.org/media/documents/Holistic_Admissions_in_the_Health_Professions.pdf.
39. Kappe R, van der Flier H. Predicting academic success in higher education: what’s more important than being smart? Eur J Psychol Educ. 2012; 27:605–619. https://doi.org/10.1007/s10212-011-0099-9.
40. Niessen AS, Meijer RR, Tendeiro JN. Measuring non-cognitive predictors in high-stakes contexts: the effect of self-presentation on self-report instruments used in admission to higher education. Pers Individ Dif. 2017; 106:183–189. https://doi.org/10.1016/j.paid.2016.11.014.