Abstract
Purpose:
This study aimed to test the construct validity of an instrument to measure student professional behaviors in entry-level occupational therapy (OT) students in the academic setting.
Methods:
A total of 718 students from 37 OT programs across the United States completed a self-assessment survey of professional behavior that we developed. The survey asked respondents to rate 28 attributes, each on a 5-point Likert scale. A split-sample approach was used for exploratory and then confirmatory factor analysis.
Results:
A three-factor solution with nine items was extracted using exploratory factor analysis (EFA) (n=430, 60%). The factors were ‘Commitment to Learning’ (2 items), ‘Skills for Learning’ (4 items), and ‘Cultural Competence’ (3 items). Confirmatory factor analysis (CFA) on the validation split (n=288, 40%) indicated acceptable fit for this three-factor model (fit indices: CFI=0.96, RMSEA=0.06, and SRMR=0.05). Internal consistency reliability estimates of the individual factors and the instrument ranged from 0.63 to 0.79.
Conclusion:
Results of the CFA in a separate validation dataset provided robust measures of goodness-of-fit for the three-factor solution developed in the EFA and indicated that the three-factor model fit the data adequately. We therefore conclude that this student professional behavior evaluation instrument is a structurally validated tool for measuring professional behaviors reported by entry-level OT students. The internal consistency reliability of each individual factor and of the whole instrument was adequate to good.
A survey of 166 rehabilitation professional educators across the United States (US) indicated that 89% of the respondents expressed concern about the professional behaviors of one or more of their entry-level students [1]. As students progress through the curriculum, their professional behaviors need to be monitored and evaluated. One approach to monitoring student professional behaviors is the use of self-assessment [2]. Despite these seemingly endorsed methods, we could not find any validated instruments for assessing health professions students’ professional behaviors in the academic setting before their clinical placement, nor any study in which the construct structure of such an assessment had been validated. In this study, we developed and validated an instrument that evaluates entry-level occupational therapy (OT) student professional behaviors in the academic setting. Content validity of the item pool for the instrument was established via expert review and pilot testing. Construct validity of the instrument was assessed via exploratory and confirmatory factor analyses using a split-sample approach.
Since there was no sampling frame of all current entry-level students enrolled in OT programs in the US, an initial step was to develop a way to contact them. To achieve this, we contacted the president of the student occupational therapy association (SOTA) of each OT program in the US through regular mail. An introductory letter was sent to 151 accredited entry-level OT programs nationwide with the intention of establishing a professional relationship with each SOTA president through an exchange of email addresses. After the initial mailing, 33 SOTA presidents responded; subsequently, a second round of letters was sent to the non-respondents. Following this second mailing, 40 additional responses were received, for a total of 73 SOTA presidents.
Having compiled a database of 73 SOTA presidents as contact persons for potential OT student participants, we conducted this study using survey methods. In late September of 2014, we emailed each of the 73 SOTA presidents an invitation letter explaining the purpose of the study, along with the Uniform Resource Locator (URL) of the survey instrument, and requested that they forward the hyperlink of the survey, a questionnaire on student professional behaviors, to all OT students in their program. The survey was posted on Survey Monkey, an online survey service, which provided the URL for the questionnaire.
After the first round of emailing, 299 responses were received. A second round of emails was sent out in mid-October of 2014, after which an additional 453 responses were received, for a total of 752. Data collection was conducted between late September and early November of 2014. Students from 37 programs completed the survey. Of the 752 responses, 34 respondents completed only the first 10 or 19 items of the questionnaire, leaving at least the last nine items (about one-third of the questionnaire) and the demographic information unanswered. We excluded these 34 responses; as a result, the final sample consisted of 718 respondents.
The questionnaire in this survey consisted of two sections. The first section had 28 items (i.e., attributes) related to various student professional behaviors (Appendix 1). We grouped these 28 items a priori into three subsections: attributes dealing with tasks (i.e., learning; 10 items), with oneself (9 items), and with others (9 items) [3]. Exhibition of each attribute was rated on a 5-point scale ranging from 1 (never) to 5 (always), with 2=rarely, 3=occasionally, and 4=frequently. The second section had six items on demographic information, which included age, gender, race, year in the OT program, number of level II fieldwork rotations completed, and the OT program in which the respondent was enrolled. The 28 items were drawn mainly from the literature [4-6] and from departmentally developed student professional behavior forms used by several academic health professions programs.
The content of the student professional behavior questionnaire was validated by three experienced academicians, including an academic coordinator of fieldwork education, each of whom had 10 to 20 years of teaching experience in an OT program. The preliminary version of the questionnaire was then distributed to a group of 30 first-year entry-level OT students for comment and feedback. The questionnaire was finalized after incorporating their ideas and comments.
The collected data were tabulated and analyzed using descriptive statistics. Exploratory and confirmatory factor analyses were conducted to determine an underlying structure for the 28 items related to student professional behaviors. Cronbach’s alpha was used to estimate the internal consistency reliability of the emerging factors.
We randomly split the sample of 718 into two parts, with 60% of the data (n=430) subjected to exploratory factor analysis (EFA) and 40% (n=288) subjected to confirmatory factor analysis (CFA) for verification of the factor structure derived from the EFA. An EFA with the maximum likelihood extraction method was used to preliminarily evaluate the dimensionality, or structural validity, of the 28 items related to student professional behaviors. The maximum likelihood extraction method is recommended for determining the factor structure because it generates the solution that most accurately reflects the underlying population pattern, especially when loadings within factors are unequal and when too few factors are extracted [7]. A variable was considered important in explaining the variance of a factor if its factor loading exceeded 0.4. As the factors emerging from the data were expected to be correlated, a direct oblimin oblique rotation method was used to achieve a simpler structure for interpretation. The number of extracted factors was determined based on the scree plot and the Guttman-Kaiser criterion, which specifies retaining factors with eigenvalues greater than 1. Goodness-of-fit of the CFA models was tested using the robust maximum likelihood method implemented in the CALIS procedure of SAS v9.4 [8]. Adequacy of the CFA models was examined using Bentler’s comparative fit index (CFI), the root mean square error of approximation (RMSEA), and the standardized root mean square residual (SRMR). In general, CFI ≥0.95, RMSEA ≤0.06, and SRMR ≤0.08 are indicative of good fit. SAS/STAT v9.4 software was used to conduct the statistical analyses.
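A minimal SAS sketch of this analysis pipeline is given below. It assumes a hypothetical input dataset named survey_data containing item variables item1-item28; the dataset name, random seed, factor names, and the placeholder names for the nine retained items are illustrative assumptions rather than the actual variables used in the study. The default output of PROC FACTOR and PROC CALIS includes the communalities, Kaiser's measure of sampling adequacy (with the MSA option), and the CFI, RMSEA, and SRMR fit indices referred to above. Note that a simple RANUNI split gives an approximately, rather than exactly, 60/40 allocation.

/* Randomly split the responses into roughly 60% exploratory and 40% confirmatory subsets */
data efa_split cfa_split;
   set survey_data;                                   /* hypothetical input dataset  */
   if ranuni(20140901) < 0.60 then output efa_split;  /* arbitrary illustrative seed */
   else output cfa_split;
run;

/* Exploratory factor analysis: maximum likelihood extraction, direct oblimin rotation,
   factor retention guided by the eigenvalue-greater-than-1 rule and the scree plot;
   MSA requests Kaiser's measure of sampling adequacy */
proc factor data=efa_split method=ml rotate=oblimin mineigen=1 scree msa;
   var item1-item28;
run;

/* Confirmatory factor analysis of the final three-factor, nine-item model using robust
   maximum likelihood estimation; item names are placeholders for the retained items */
proc calis data=cfa_split method=ml robust;
   factor
      CommitLearn  ===> item1 item2,
      SkillsLearn  ===> item3 item4 item5 item6,
      CulturalComp ===> item7 item8 item9;
run;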
Ethical approval: The study was approved by the Institutional Review Board of the University of Alabama at Birmingham (E140320001).
Of the 718 respondents, 14 did not complete the demographic information section of the survey. Among the 704 respondents who completed both sections of the questionnaire, 629 (89.3%) were female and 570 (81.0%) were Caucasian. The respondents’ mean age was 25±5.3 years (range, 17 to 54). The majority (605, 85.9%) had not had any level II fieldwork experience; 289 (41.1%) were in their first year of the OT program, 301 (42.8%) in their second year, and 114 (16.2%) in their third year or beyond.
Sampling adequacy was checked to determine whether the data met the criteria for factor analysis. Sampling adequacy was good, with a Kaiser-Meyer-Olkin value of 0.89 (i.e., >0.5), and Bartlett’s test of sphericity was significant (P<0.001), indicating that the data were suitable for factor analysis. Four factors were extracted using the EFA, and Horn’s parallel analysis confirmed that four initial factors could be extracted. The four factors accounted for 43.0% of the variance in student professional behaviors.
However, the factor structure derived from this EFA was not confirmed in the CFA using the second part of the randomly split sample (n=288): the CFA results suggested that the four-factor model provided less than acceptable fit to the data. Therefore, a second attempt to obtain a factor structure was made. We ran separate EFAs for the items within each of the three subsections using the maximum likelihood extraction method. In each of the three EFAs, we sequentially removed the items with the lowest communalities until the remaining items all had communalities greater than 0.35. This procedure yielded nine items. The resulting three-factor structure was then subjected to confirmation by CFA using both the first (n=430) and the second (n=288) parts of the randomly split sample.
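The subsection-level EFAs could be run as sketched below, shown here for a hypothetical set of ten ‘dealing with tasks’ items named task1-task10; the item names are placeholders. In practice, the call is repeated after dropping the item with the lowest final communality until all remaining communalities exceed 0.35.

/* EFA within one a priori subsection; the "Final Communality Estimates" table in the
   output is inspected, the lowest-communality item below 0.35 is dropped, and the
   step is rerun on the reduced item list */
proc factor data=efa_split method=ml rotate=oblimin mineigen=1;
   var task1-task10;
run;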
The three factors extracted from the EFA were labeled ‘Commitment to Learning’, ‘Skills for Learning’, and ‘Cultural Competence’. The CFA on the validation split (n=288) indicated that the three-factor model provided acceptable fit to the data (fit indices: CFI=0.96, RMSEA=0.06, and SRMR=0.05). CFA estimates are presented in Table 1 for both sample splits. Using the whole sample (N=718), the internal consistency reliability of each of the three factors estimated by Cronbach’s alpha was 0.63 for Commitment to Learning (two items), 0.72 for Skills for Learning (four items), and 0.79 for Cultural Competence (three items). The internal consistency reliability of the full nine-item instrument was 0.79.
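Cronbach’s alpha for each factor and for the full nine-item scale can be obtained with PROC CORR, as sketched below; the item names are placeholders for the three Cultural Competence items, and the same call with the corresponding item lists yields the estimates for the other two factors and for the nine-item instrument.

/* Internal consistency (Cronbach's alpha) for one factor, computed on the whole sample */
proc corr data=survey_data alpha nocorr;
   var item7 item8 item9;    /* placeholder names for the Cultural Competence items */
run;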
The factors (Commitment to Learning, Skills for Learning, and Cultural Competence) derived from the three subsections of the original student professional behavior questionnaire were based on dimensionality reduction using EFA, and they retain some of the character (tasks, oneself, and others) of the three initial subsections. The content validation of the 28 items grouped into the three subsections of the original questionnaire offered a strong rationale for conducting an EFA within each subsection. The results of the CFA on each of the split samples (i.e., the exploratory and confirmatory splits) provide robust evidence that a three-factor model using the selected items fits the data well. Therefore, we conclude that this student professional behavior evaluation instrument is a structurally validated tool for measuring professional behaviors reported by entry-level OT students. The internal consistency reliability of each individual factor and of the whole instrument was considered adequate and good, respectively.
The most frequent student professional behavior issues exhibited by rehabilitation professional students in the academic setting were lack of personal responsibility, social intolerance, disrespect of others, tardiness, missed appointments, excessive absences, failure to meet deadlines, and dress code violations [1]. The present instrument targets the assessment of some of the more abstract aspects of professional behavior, such as personal responsibility (as in Skills for Learning) and social tolerance and respect of others (as in Cultural Competence), rather than more concrete and objective behaviors such as tardiness, missed appointments, absences, and failure to meet deadlines.
Completion of this professional behavior self-assessment can be part of a student’s personal and professional development plan in the academic setting [9]. The information obtained from this assessment may help educators monitor professional behaviors among entry-level students enrolled in health professions programs. With feedback from their academic advisors, health professions students can use this instrument to improve the affective domain of professional behaviors, which is essential for success in clinical placements. Self-assessment is just one of many methods used to assess student professional behaviors [2]. To obtain a more complete picture of students’ professional behaviors, a combination of other assessment methods, such as peer assessment, direct faculty observation, and student portfolios, is recommended [10].
There are limitations to this study. Of the initial pool of 28 items in the student professional behavior questionnaire, only nine were retained in the final factor solution, which is common in a rigorous instrument validation process. Estimating the response rate was challenging. Very few SOTA presidents told us how many of their fellow students, and in which year of the program, they had forwarded the study invitation email to, despite the fact that we requested this information several times. In the end, we looked up the website of each of the 37 responding programs and obtained the student enrollment numbers. Based on the enrollment numbers posted on their websites, an estimated 1,474 students were admitted to the 37 entry-level programs in 2015. To compute the response rate, we used as the numerator the 704 respondents who identified themselves as enrolled in these 37 programs and divided it by the number of students enrolled in all 37 programs over three years, which was 4,422 (i.e., 1,474 × 3). The lower bound of the response rate was thus estimated at 15.9% (i.e., [704/4,422]×100%). However, at least six programs had only one response, most likely from the SOTA president of that program, and it was unclear to us whether these SOTA presidents distributed the invitation email to any of their fellow students.
Regardless of the exact response rate, we acknowledge that the participants in this study were recruited through a non-probability sampling method and may not be representative of OT students in entry-level programs nationwide. However, it should be noted that several demographic characteristics of the respondents in the present study are remarkably similar to those reported in the Academic Programs Annual Data Report Academic Year 2014-2015 by the American Occupational Therapy Association (e.g., 81% Caucasian compared with 82% nationwide, and 89% female compared with 89% nationwide) [11], which suggests that the sample is a good representation of students in entry-level OT programs in the US. Most importantly, further studies need to investigate the validity of this student professional behavior evaluation instrument in predicting success in clinical experiences.
References
1. Davis DS. Teaching professionalism: a survey of physical therapy educators. J Allied Health. 2009; 38:74–80.
2. van Mook WN, Gorter SL, O’Sullivan H, Wass V, Schuwirth LW. Approaches to professional behaviour assessment: tools in the professionalism toolbox. Eur J Intern Med. 2009; 20:e153–157. http://dx.doi.org/10.1016/j.ejim.2009.07.012.
3. van Mook WN, van Luijk SJ, O’Sullivan H, Wass V, Harm Zwaveling J, Schuwirth LW. The concepts of professionalism and professional behaviour: conflicts in both definition and learning outcomes. Eur J Intern Med. 2009; 20:e85–89. http://dx.doi.org/10.1016/j.ejim.2008.10.006.
4. Anderson DK, Irwin KE. Self-assessment of professionalism in physical therapy education. Work. 2013; 44:275–281. http://dx.doi.org/10.3233/WOR-121504.
5. Foord-May L, May W. Facilitating professionalism in physical therapy: theoretical foundations for the facilitation process. J Phys Ther Educ. 2007; 21:6–12.
6. Santasier AM, Plack MM. Assessing professional behaviors using qualitative data analysis. J Phys Ther Educ. 2007; 21:29–39.
7. de Winter JCF, Dodou D. Factor recovery by principal axis factoring and maximum likelihood factor analysis as a function of factor pattern and sample size. J Appl Stat. 2012; 39:695–710. http://dx.doi.org/10.1080/02664763.2011.610445.
8. Yuan KH, Hayashi K. Fitting data to model: structural equation modeling diagnosis using two scatter plots. Psychol Methods. 2010; 15:335–351. http://dx.doi.org/10.1037/a0020140.
9. Hills C, Ryan S, Smith DR, Warren-Forward H. The impact of ‘Generation Y’ occupational therapy students on practice education. Aust Occup Ther J. 2012; 59:156–163. http://dx.doi.org/10.1111/j.1440-1630.2011.00984.x.
10. Kress VE, Protivnak JJ. Professional development plans to remedy problematic counseling student behaviors. Counselor Educ Superv. 2009; 48:154–166. http://dx.doi.org/10.1002/j.1556-6978.2009.tb00071.x.
11. Harvison N. Academic programs annual data report academic year 2014-2015 [Internet]. Bethesda, MD: American Occupational Therapy Association; 2015 [cited 2016 Apr 2]. Available from: http://www.aota.org/-/media/corporate/files/educationcareers/educators/2014-2015-annual-data-report.pdf.