
International Journal of Health Professions

The journal of Verein zur Förderung der Wissenschaft in den Gesundheitsberufen


Creating an Online Interprofessional Collaborative Team Simulation to Overcome Common Barriers of Interprofessional Education / Eine internetbasierte, interprofessionelle Teamsimulation zur Überwindung organisatorischer Hürden in der interprofessionellen Ausbildung

Kelli Lee Kramer-Jackman / Dory Sabata / Heather Gibbs / Judy Bielby / Jessie Bucheit / Sarah Bloom / Sarah Shrader
Published Online: 2017-09-15 | DOI: https://doi.org/10.1515/ijhp-2017-0022

Abstract

Introduction

Coordinating student schedules, physical space, and faculty time is a commonly reported barrier to successful interprofessional education. Using online technologies to overcome these barriers and support online team simulation is a topic that deserves serious academic review.

Methods

The Interprofessional Plan of Care - Simulated E-hEalth Delivery System (IPOC-SEEDS) is a student-directed online simulation in which students experience a collaborative plan-of-care meeting with simultaneous team electronic health record utilization. The authors describe the IPOC-SEEDS simulation to serve as a model for replication or modification. IPOC-SEEDS objectives address Interprofessional Education Collaborative (IPEC) competencies, electronic health record (EHR) navigation, simulation effectiveness, and technology utilization.

Results

Overall, the IPOC-SEEDS objectives were met: simulation evaluations, student-led debriefing evaluations, in-person student feedback, and faculty feedback supported the online simulation and its technology revisions. The objectives, based on IPEC and informatics competencies, were achieved. Students from nursing, nutrition, pharmacy, occupational therapy, and health information management participated in the simulation using EHR and online meeting software, gaining valuable interprofessional practice. Technology utilization results were adequate and improved in subsequent simulations after the selected technology was modified.

Discussion

The simulation provided an experience in which students demonstrated interprofessional collaborative skills that they can use in their future practice. Online technologies can provide a platform for high-quality interprofessional simulation that addresses common interprofessional education barriers and provides access to interprofessional education for distance-learning students and providers. Online simulation developers (hospitals, health departments, universities) can use the authors’ process steps as a model for online simulation replication.

Abstract

Background

Coordinating timetables, rooms, and curricula often poses a difficult challenge for successful interprofessional education (IPE). The use of Internet-based technologies that overcome these obstacles and support team simulation is a topic that deserves serious academic discussion. The Interprofessional Plan of Care - Simulated E-hEalth Delivery System (IPOC-SEEDS) is an Internet-based simulation in which students from several health professions jointly create a plan of care on the basis of an electronic health record (EHR). IPOC-SEEDS aims to improve EHR navigation and interprofessional competencies. The aim of this article is to present IPOC-SEEDS and its evaluation.

Methods

A total of 206 students from nursing, nutrition, pharmacy, occupational therapy, and health information management took part in simulation exercises with IPOC-SEEDS. Afterwards, they completed an online questionnaire on the value of IPOC-SEEDS; the exercise debriefings and feedback from faculty were analysed qualitatively.

Results

The IPOC-SEEDS objectives were fulfilled. Simulation evaluations, debriefings, and feedback from students and faculty support the value of the online simulation and its technological refinements. The objectives, based on IPEC and informatics competencies, were achieved. Technology utilisation results were adequate and improved after the selected technology was modified.

Discussion

Internet-based technologies offer a platform for high-quality IPE simulations that remove organisational barriers to IPE and make IPE possible even for distance-learning students. Online simulation developers (universities, hospitals, health departments) can use the authors' process steps as a model for replicating online simulations.

Keywords: interprofessional education; simulation; online; barriers; informatics


‘Interprofessional collaboration (IPC) occurs when two or more professions work together to achieve common goals and is often used as a means for solving a variety of problems and complex issues’ (Green & Johnson, 2015, p. 1). IPC practice models are being used in healthcare settings, along with informatics, to improve patient care outcomes (Health Resources and Services Administration, 2013; World Health Organization (WHO), 2010; Cox & Naylor, 2013; Health Force Ontario, 2013; Hopkins, 2010; Cuff, 2013; Manos, 2012; Institute of Medicine, 2012). Future health providers need interprofessional (IP) communication and teamwork training to work collaboratively to provide patient-centred care (Hopkins, 2010). To qualify as interprofessional education (IPE), these activities must also involve students from two or more professions learning about, from and with each other to enable effective collaboration and improve health outcomes (WHO, 2010; Interprofessional Education Collaborative, 2011). Development and implementation of high-quality activities that achieve the core Interprofessional Education Collaborative (IPEC) competencies, which are required by accreditors of US health professions programmes, are important (Begley, 2009). However, many barriers to in-person IPE have been documented consistently in the literature. The most common barriers to IPE are logistical and resource issues such as scheduling conflicts, geographical location/physical space and faculty time (Begley, 2009; Oandasan & Reeves, 2005; Lawlis, Anson & Greenfield, 2014).

In addition to overcoming barriers, educators must also select the best instructional design methods for IPE. Two common instructional methods in the IPE literature are online module learning and in-person simulation (Abu-Rish, et al., 2012). IPE using online modules as part of a stand-alone blended activity or course has been reported with mixed results regarding educator and student reactions and attitudes (McKenna, Boyle, Palermo, Molloy, Williams & Brown, 2014; Blue et al., 2010). In-person IPE simulations typically have favourable effects regarding learner satisfaction, reaction, perceived authenticity and attitudes (Shoemaker, Beasley, Cooper, Perkins, Smith & Swank, 2011; Zhang, Thompson & Miller, 2011). In addition, a recent example of combining online modules and online simulation principles using virtual patient technology demonstrated feasibility and positive student reactions (Shoemaker, Platko, Cleghorn & Booth, 2014). More examples of combining online IPE methods are needed.

The purpose of this article is twofold. First is to describe the process steps for creating the Interprofessional Plan of Care – Simulated E-hEalth Delivery System (IPOC-SEEDS) simulation, to serve as a model for replication or modification. Second is to disseminate the results from this innovative online simulation, which was specifically developed to achieve IPEC and informatics competency domains whilst minimising common barriers to IPE.

Methods

Over 10 months, a team of IP faculty and students met to create the IPOC-SEEDS simulation. Figure 1 outlines the five-step process that was used to plan, develop, implement, assess and revise the IPOC-SEEDS simulation. A lead faculty member was tasked with coordinating and scheduling team meetings, updating team members and identifying work to be completed.

Figure 1: IPOC-SEEDS Simulation Process Steps

Step 1 Plan

The first step, plan, began with assembling an IP team including faculty from advanced practice nursing, dietetics and nutrition, health information management, occupational therapy and pharmacy; staff experts in informatics, teaching technology and electronic health records; and graduate assistants. A review of current IPE simulation literature was conducted to inform the development of the simulation. The team identified learning objectives for the IPOC-SEEDS simulation to address selected core IPEC competencies targeting the domains of teams and teamwork (TT4, TT5, TT8) and roles and responsibilities (RR4; Interprofessional Education Collaborative Expert Panel, 2011). An additional simulation objective identified by the team was for all involved professions to navigate the electronic health record (Quality and Safety Education for Nurses, n.d.). Study objectives were to assess the effectiveness, value and technology used during the IPOC-SEEDS simulation. The planning step also included determining the content to be developed and the technology systems to be used for delivery.

Step 2 Develop

The second step of the process, develop, involved development of content, assessment measures and technology utilisation (see Figure 1). Content development included a patient case, uni-professional audio patient encounters, team-led debriefing principles and assessment measures. The available technology used to develop the simulation included an electronic health record (EHR), a learning management system (LMS) and a web-conferencing system. Specific development tasks for each focus area are cross-walked in Table 1 with the review method used during each development step. The developmental tasks provide an idea of workload for each focus area. Review of these tasks mostly consisted of multiple iterations of IP development team feedback or peer review. Development took 10–30 h per week of the lead faculty’s time during the first year and approximately 2–10 h monthly for all others involved.

Table 1: IPOC-SEEDS Simulation Development

Content development. The content of the simulation included a patient case, uni-professional audio patient encounters for each profession, team-led debriefing and assessment measures. In order to create a need for team collaboration, each profession was given incomplete information about the case. For example, the nursing student audio recording was the initial assessment interview with no other team members present, so the nursing student had to share important information with the team. Audio recordings were chosen to attain a realistic encounter that was practical and feasible whilst minimising costs. The goal of the team-led debriefing content was to frame the importance of debriefing and to assure quality, because the students would not have faculty oversight.

Simulation assessment measures development. Three assessment measures were developed or modified to assess the IPOC-SEEDS simulation content regarding IP knowledge gained, collaborative teamwork, team debriefing, simulation objectives and technology delivery.

Collaborative teamwork (PACT-novice). The Performance Assessment of Communication and Teamwork Novice Observer Form (PACT-novice) evaluates five domains (team structure, leadership, situation monitoring, mutual support and communication), each rated on a five-point scale (1 = poor, 3 = average, 5 = excellent). The PACT-novice tool was chosen because it was designed and validated for novice raters and is simple to use for real-time or retrospective video observations (University of Washington, 2011).

Team debriefing (modified DASH). After an extensive search of the literature, no tools were found for evaluating student-led debriefing. Therefore, the Debriefing Assessment for Simulation in Healthcare (DASH) tool was modified (elements one and five were removed) prior to the observations to make it more applicable to this simulation and to the instructions given to the student debriefing leaders (Brett-Fleegler, Rudolph & Eppich, 2012). The teams were scored on a seven-point rating scale (1 = extremely ineffective/detrimental to 7 = extremely effective/outstanding) in the following areas, with a worked scoring sketch after the list:

  • Element Two: Maintains an engaging learning environment

  • Element Three: Structures the debriefing in an organised way

  • Element Four: Provokes engaging discussion

  • Element Six: Helps trainees achieve or sustain good future performance
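
For concreteness, here is a minimal sketch (in Python) of how one team's modified-DASH ratings could be summarised; the element ratings shown are hypothetical illustrations, not study data.

```python
# A sketch of summarising one team's ratings on the modified DASH tool
# (elements 2, 3, 4 and 6, each rated 1-7). The ratings below are
# hypothetical illustrations, not study data.

MODIFIED_DASH_ELEMENTS = {
    2: "Maintains an engaging learning environment",
    3: "Structures the debriefing in an organised way",
    4: "Provokes engaging discussion",
    6: "Helps trainees achieve or sustain good future performance",
}

def mean_dash_score(ratings: dict) -> float:
    """Average a team's 1-7 ratings across the four retained elements."""
    assert set(ratings) == set(MODIFIED_DASH_ELEMENTS), "rate all four elements"
    assert all(1 <= r <= 7 for r in ratings.values()), "ratings must be 1-7"
    return sum(ratings.values()) / len(ratings)

team_ratings = {2: 6, 3: 5, 4: 6, 6: 5}   # hypothetical observer ratings
print(f"Mean modified-DASH score: {mean_dash_score(team_ratings):.2f}")  # 5.50
```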

Simulation objectives and technology delivery (IPOC-SEEDS evaluation). A review of the literature yielded no reliable and valid measures for assessing whether students met the IPOC-SEEDS simulation or technology delivery objectives. Therefore, the IP development team created these measures. Multiple iterations were piloted, and ultimately the survey consisted of 34 questions: 15 demographic questions concerning profession, age, gender, ethnicity, city, state, online education experience, technology experience, EHR experience and IPE simulation experience; 10 five-point Likert-scale questions [Not at all (0–5%); Sometimes (25%); Often (50%); Most of the time (75%); Always (95–100%)] about individual and team behaviour during team meetings; 4 yes/no questions addressing the simulation and technology used; and 5 open-ended questions about IP knowledge, evaluation of the IP simulation content and technology delivery.
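
As an illustration, the five response labels could be coded ordinally for analysis as in the minimal sketch below; the 0-4 numeric coding is an assumption made for illustration, not the authors' published scoring scheme.

```python
# A sketch of coding the five response labels to ordinal values for
# analysis. The 0-4 coding is an assumption made for illustration.

LIKERT_CODES = {
    "Not at all (0-5%)": 0,
    "Sometimes (25%)": 1,
    "Often (50%)": 2,
    "Most of the time (75%)": 3,
    "Always (95-100%)": 4,
}

def code_responses(labels):
    """Map label responses to ordinal codes for descriptive statistics."""
    return [LIKERT_CODES[label] for label in labels]

print(code_responses(["Always (95-100%)", "Most of the time (75%)"]))  # [4, 3]
```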

Technology development and utilisation. Online technologies selected to deliver this simulation included an EHR, an LMS and a web-conferencing system. EHR documentation forms for each profession were selected or developed by the respective profession faculty in consultation with the informatics expert, avoiding free-text forms and considering the level of the student. The simulation was housed in an LMS-delivered course that was developed to meet Quality Matters criteria and supported by teaching and learning technology experts (Quality Matters, 2014).

Step 3 Implement

The third step of the process, implement, occurred over three weeks (see Figure 1). The first week's implementation activities involved introducing students to IP teamwork and providing technology training on the EHR, LMS and web-conferencing system (see Figure 2).

Figure 2: IPOC-SEEDS Online Simulation Methods

Students were also assigned to IP teams and used secure email to schedule their own team meeting times for week 3. During the second week, students individually reviewed the patient's chart in the EHR, listened to the uni-professional audio patient encounter recording and documented as the provider in the EHR. During the third and final week, students met synchronously in IP teams to discuss and determine an IP collaborative care plan and document their team plan of care in the EHR. Afterwards, the students managed the team-led debrief of the simulation, without faculty, discussing team functioning and areas for improvement. Finally, the students anonymously and voluntarily completed a short evaluation of the simulation in REDCap, a data capture system for research (Harris, Taylor, Thielke, Payne, Gonzalez & Conde, 2009). This project was approved by the institutional review board before implementation.

Step 4 Assess

The fourth step, assess, included assessment of the simulation (see Figure 1). Table 2 displays the objectives along with how they were assessed. The PACT-novice measure was used to assess the team meeting videos. The modified DASH measure was used to assess the recorded debriefing sessions. The rest of the implementation was evaluated through student feedback on the IPOC-SEEDS evaluation or faculty review of student team content. Formal and informal student and faculty feedback was also gathered throughout to improve future simulation delivery.

Table 2: IPOC-SEEDS Simulation Objectives with Respective Assessment

Quantitative Assessment Methods. Quantitative questions from the IPOC-SEEDS evaluation regarding demographics, individual and team behaviour related to IPEC objectives and simulation effectiveness were analysed. Descriptive statistics for these items included central tendency (mean, median), dispersion (standard deviation, range, minimum and maximum score), distribution (skewness and kurtosis) and normality (histograms and distribution analysis).
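
A minimal sketch of these descriptive statistics in Python follows, assuming the coded evaluation responses were exported from REDCap to a CSV file; the file name and column name are hypothetical.

```python
# A sketch of the descriptive statistics listed above, assuming the coded
# evaluation responses were exported from REDCap to a CSV file. The file
# name and column name are hypothetical. Plotting requires matplotlib.
import pandas as pd

df = pd.read_csv("ipoc_seeds_evaluation.csv")      # hypothetical export
item = df["spoke_up_for_profession"]               # one coded Likert item (0-4)

summary = {
    "mean": item.mean(),
    "median": item.median(),
    "std dev": item.std(),
    "min": item.min(),
    "max": item.max(),
    "range": item.max() - item.min(),
    "skewness": item.skew(),
    "kurtosis": item.kurt(),
}
print(summary)

item.hist(bins=5)  # histogram to inspect the distribution for normality
```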

Content Analysis Assessment Methods. Inductive content analysis (Patton, 1990) of the IPOC-SEEDS evaluation was completed to identify common emergent themes and patterns in student responses to two open-ended questions on the simulation assessment measure: ‘What value does this simulation have for you and your profession?’ and ‘What would you change to improve the effectiveness of this simulation in the future?’ Because technology issues were of particular interest, answers to the latter question were further evaluated for any complaints about technology. Two researchers independently coded each item. The raters agreed on 75.9% (n = 469/618) of comments and then met to discuss differing results in order to reach a consensus on all items (Patton, 1990).
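
The reported agreement figure is simple percent agreement; a minimal sketch reproducing the arithmetic follows.

```python
# A sketch of the percent-agreement figure reported above: the two coders
# assigned the same code to 469 of 618 comments.

def percent_agreement(codes_a, codes_b):
    """Share of items to which both raters assigned the same code."""
    assert len(codes_a) == len(codes_b), "raters must code the same items"
    agreed = sum(a == b for a, b in zip(codes_a, codes_b))
    return agreed / len(codes_a)

print(f"{469 / 618:.1%}")  # 75.9%, matching the reported agreement
```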

Behavioural Observation Assessment Methods. Nine of the 36 teams were randomly selected for evaluation using Internet-based software (Urbaniak & Plous, n.d.). Two validated rubrics were used by two independent student observers to retrospectively assess students' behaviours regarding IP teamwork and the quality of student-led debriefing via videos of the online simulation and debriefing session. The two observers conducted a training session to discuss the rubrics and then independently assessed a randomly selected team to establish inter-rater reliability. During the training session, team scores between observers varied by less than 10% and inter-rater reliability was established. See the Development section for further discussion of the PACT-novice and modified DASH tools that were used (University of Washington, 2011; Brett-Fleegler, Rudolph & Eppich, 2012).
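
A minimal sketch of the random selection follows; the study used the Research Randomizer website, and Python's random.sample is shown only as an equivalent stand-in. Reading the 10% inter-rater criterion as a fraction of the scale maximum is an assumption.

```python
# A sketch of selecting 9 of the 36 teams for video review. The study used
# the Research Randomizer website; random.sample is an equivalent stand-in.
import random

random.seed(2014)                       # arbitrary seed for repeatability
team_ids = list(range(1, 37))           # teams numbered 1-36
selected = sorted(random.sample(team_ids, k=9))
print("Teams selected for observation:", selected)

def within_ten_percent(score_a: float, score_b: float, scale_max: float = 5.0) -> bool:
    """Inter-rater check: do two observers' team scores differ by less than
    10%? Treating '10%' as a fraction of the scale maximum is an assumption."""
    return abs(score_a - score_b) / scale_max < 0.10
```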

Step 5 Revise

The final step, revise, focused on the changes in IPE and IP technology based on the previous results (see Figure 1). Revisions were based on assessment results and feedback from students and faculty. Examples of content revisions include clarifying student directions and reducing the length of the IPOC-SEEDS evaluation survey. Examples of technology revisions include changing the web-conferencing system to another commercially available platform with a quicker connection time and having information technology experts available in class during the first week to aid in troubleshooting student issues.

Results

The IPOC-SEEDS was a student-directed online simulation of the care-planning process for a patient transitioning from one level of care to another, such as from the hospital to a rehabilitation facility. The simulation involved an IP team meeting and an EHR developed for academic purposes. The students scheduled and coordinated their IP teams, progressing through three weeks of content and simulation activities (see Figure 2).

Demographics

Students from advanced practice nursing, dietetics and nutrition, occupational therapy and pharmacy participated in the fall 2013 pilot (N = 100) and spring 2014 (N = 106) simulations, with both sets of data presented here (see Table 3). A majority of students were Caucasian (83%, 81%), female (87%, 75%) and in their 20s (72%, 71%). Fourteen percent (fall 2013) and 27% (spring 2014) of students participated online from a location at least 3 h away from the Midwest university location. Most students rated their technology experience as ‘competent, 2–3 years’ or ‘proficient, 3–5 years’. The simulation was predominantly the students’ first EHR experience in fall 2013, whereas EHR experience was more evenly distributed among spring 2014 students.

Table 3: Demographics

Quantitative

Results for individual and team behaviour reported during the student team meetings related to IPEC objectives were positive overall (see Table 4).

Table 4: Student Evaluation Results

For both the pilot 2013 (N = 100) and spring 2014 (N = 106) cohorts, the majority of students selected ‘most of the time’ (75% or more) or ‘always’ (95% or more) in response to the following questions: [How often did you] speak up/advocate for your profession?; feel empowered to speak freely?; feel your opinion was valued?; contribute to the plan of care?; fulfil your professional role?; and fulfil your team role? (see Table 5).

Table 5: Assessment of Interprofessional Teamwork Using the Performance Assessment of Communication and Teamwork (PACT) Tool during an Online Simulation

For both the pilot 2013 (N = 100) and spring 2014 (N = 106) cohorts, the majority of students selected ‘most of the time’ or ‘always’ in response to the following questions: [How often did your team] introduce selves/describe profession? and keep the patient’s perspective considered throughout the meeting? Ninety percent (pilot 2013, N = 100) and 92% (spring 2014, N = 106) of students said that all opinions were valued during the team meeting at all times (always/95–100% of the time).

Likewise, the simulation’s effectiveness was positive overall. Ninety-eight percent (pilot 2013, N = 100) and 95.3% (spring 2014, N = 106) of students found the case realistic. One hundred percent (pilot 2013, N = 100; spring 2014, N = 106) of students felt their team collaborated to come to a consensus on the plan of care. Eighty-four percent (pilot 2013, N = 100) and 92.5% (spring 2014, N = 106) of students felt that they had adequate training to debrief as a team. Ninety-eight percent (pilot 2013, N = 100) and 96.2% (spring 2014, N = 106) of students felt that it was appropriate to have students lead the team debrief.

Content Analysis. One hundred comments from the pilot and 106 comments from the spring 2014 simulation were received in response to ‘What value does this simulation have for you and your profession?’ The most frequent answers from both iterations concerned ‘learning roles and responsibilities’ (n = 29, n = 34, respectively) and ‘gaining interprofessional practice’ (n = 34, n = 35, respectively). ‘Improving patient care’ (n = 15, n = 7) and ‘improving confidence in communication’ (n = 5, n = 12) were also identified as valuable contributions to the learning experience.

Ninety-nine comments from the pilot and 106 comments from the spring 2014 simulation were received in response to ‘What would you change to improve the effectiveness of this simulation in the future?’ The most common recommendations for change from those in the pilot included ‘deliver the simulation in-person’ (n = 32) and ‘address technology issues’ (n = 28). In the second iteration, fewer students recommended to ‘deliver the simulation in-person’ (n = 17), whilst a similar number commented ‘address technology issues’ (n = 29). Overall, the percentage of comments that identified technology issues improved from 54% in the pilot to 46% in the spring 2014 simulation. Technology revisions made after the pilot are described under Step 5 Revise.

Behavioural Observation

The observed teams effectively demonstrated IP teamwork during the simulation, with the majority of teams scoring average to excellent across all five domains of the PACT-novice tool (University of Washington, 2011). The most impressive domain was the students’ ability to provide mutual support, where six of nine teams scored excellent. The most common example of mutual support involved students from one profession asking questions of another profession about a potential part of the care plan to engage everyone on the team. There were no scores of ‘poor’, and only two teams scored ‘poor to average’ across all five domains. For the complete results of IP teamwork, see Table 5.

Team-led debriefing appeared to be effective based on scores from the modified DASH tool for all nine teams (Brett-Fleegler, Rudolph & Eppich, 2012). Across the four observed elements of the DASH tool, the majority of team scores ranged from mostly effective/good to extremely effective/outstanding. The scores demonstrated the students’ ability to maintain an engaging learning environment, keep the debrief structured and organised, provoke an engaging discussion and sustain good future performance. For the complete results of the student team-led debriefing, see Table 6.

Table 6: Evaluation of Student Team-led Debriefing with the Debriefing Assessment for Simulation in Healthcare (DASH) Tool

Discussion

The online IPOC-SEEDS simulation effectively provided IP students with a collaborative team experience using the EHR that overcame the common barriers of scheduling, physical space and faculty time conflicts. The IPOC-SEEDS simulation objectives, based on IPEC and informatics competencies, were achieved. Simulation effectiveness and value were established as well. Technology utilisation results were adequate but did improve after modifying the technology selected. The simulation provided an experience where students demonstrated IP collaborative skills that they can use in their future practice.

Simulation limitations include that some assessment measures used were newly developed or modified and that students were not assessed for providing socially desirable responses. Additionally, technology can present its own limitations and challenges (Lawlis, Anson & Greenfield, 2014). For example, technologies such as the LMS, EHR and web-conferencing system are crucial, and when they fail to work appropriately, disruption occurs, halting the team’s productivity and increasing frustration levels. Backup plans that include technology support should be developed. Resources should be acquired to help with technology, including administration buy-in, technology staff support and possibly financial support for faculty/facilitator time, purchasing new technology and upkeep, depending on the needs of the simulation.

Online simulation utilisation is not restricted to academic educators (Suter, Arndt, Arthur, Parboosingh, Taylor & Deutschlander, 2009). Future online IPE simulation developers (hospitals, health departments, universities, conferences) can use the authors’ process steps (plan, develop, implement, assess and revise) as a model for online simulation replication. Developers can modify the process to meet their students’/participants’ needs, simulation content and available resources.

Online delivery of IP simulation has many possible benefits for its users (IP students, IP providers; Lawlis, Anson & Greenfield, 2014). First, online IPE simulation can reduce the common barriers of scheduling conflicts, limited physical space and limited faculty/facilitator time. For example, online technology can help participants coordinate and schedule team meeting times that work for their schedules. Online simulation eliminates the need for large amounts of physical space for collaboration and thereby also eliminates the need to reserve meeting rooms months in advance. Online self-directed participant simulations can reduce the faculty/facilitator time spent coordinating multiple teams of participants and leading multiple simulation debriefs. Second, online technologies can support users in actively engaging in IP team collaboration and EHR utilisation to meet academic or provider accreditation requirements (Hanna, Soren, Telner, MacNeill, Lowe & Reeves, 2012). Third, the online platform provides access for distance participants to take part in effective IPE simulations in which they otherwise could not participate due to long travel distances (Hanna, Soren, Telner, MacNeill, Lowe & Reeves, 2012).

Funding

Interprofessional New Clinical Investigator Research Grant and the Kansas Reynolds Program in Aging Interprofessional Faculty Scholars.

Acknowledgements

Interprofessional New Clinical Investigator Research Grant, Kansas Reynolds Program in Aging Interprofessional Faculty Scholars, The University of Kansas Center for Health Informatics, Cerner and University of Kansas Medical Center’s Simulated E-hEalth Delivery System, The University of Kansas Medical Center’s Teaching and Learning Technologies Department, The University of Kansas Center for Telemedicine and Telehealth and The University of Kansas Schools of Health Professions, Nursing, and Pharmacy.

References

  • Abu-Rish, E., Kim, S., Choe, L., Varpio, L., Malik, E., White, A. A., ... Zierler, B. (2012). Current trends in interprofessional education of health sciences students: A literature review. Journal of Interprofessional Care, 26(6), 444-451.

  • Begley, C. (2009). Developing inter-professional learning: Tactics, teamwork and talk. Nurse Education Today, 29(3), 276-283.

  • Blue, A. V., Charles, L., Howell, D., Koutalos, Y., Mitcham, M., Nappi, J., & Zoller, J. (2010). Introducing students to patient safety through an online interprofessional course. Advances in Medical Education and Practice, 1, 107-114.

  • Boese, T. (2011). Advancing Care Excellence for Seniors unfolding case on Red Yoder. New York: National League for Nursing. Retrieved from http://www.nln.org/professional-development-programs/teaching-resources/aging/ace-s/unfolding-cases/red-yoder.

  • Brett-Fleegler, M., Rudolph, J., Eppich, W., et al. (2012). Debriefing assessment for simulation in healthcare: Development and psychometric properties. Simulation in Healthcare, 7(5), 288-294.

  • Cox, M., & Naylor, M. (Eds.). (2013). Transforming patient care: Aligning interprofessional education with clinical practice redesign. Proceedings of a conference sponsored by the Josiah Macy Jr. Foundation in January 2013; New York. Retrieved from http://macyfoundation.org/docs/macy_pubs/JMF_TransformingPatientCare_Jan2013Conference_fin_Web.pdf

  • Cuff, P. A. (2013). Interprofessional education for collaboration: Learning how to improve health from interprofessional models across the continuum of education to practice: Workshop summary. Washington, DC: The National Academies Press.

  • Green, B., & Johnson, C. (2015). Interprofessional collaboration in research, education, and clinical practice: Working together for a better future. Journal of Chiropractic Education, 29(1), 1-10.

  • Hanna, E., Soren, B., Telner, D., MacNeill, H., Lowe, M., & Reeves, S. (2012). Flying blind: The experience of online interprofessional facilitation. Journal of Interprofessional Care, 27, 298-304. http://dx.doi.org/10.3109/13561820.2012.723071 

  • Harris, P. A., Taylor, R., Thielke, R., Payne, J., Gonzalez, N., & Conde, J. G. (2009). Research electronic data capture (REDCap) – A metadata-driven methodology and workflow process for providing translational research informatics support. Journal of Biomedical Informatics, 42(2), 377-381.

  • Health Force Ontario. (2007). Interprofessional care: A blueprint for action in Ontario. Retrieved from https://www.healthforceontario.ca/UserFiles/file/PolicymakersResearchers/ipc-blueprint-july-2007-en.pdf

  • Health Resources and Services Administration (HRSA). (2012). Interprofessional education: Coordinating center for interprofessional education and collaborative practice. Retrieved from http://bhpr.hrsa.gov/grants/interprofessional/

  • Hopkins, D. (Ed.). (2010). Framework for action on interprofessional education and collaborative practice. Geneva: World Health Organization. Retrieved from http://www.who.int/hrh/resources/framework_action/en/

  • Interprofessional Education Collaborative Expert Panel. (2011). Core competencies for interprofessional collaborative practice: Report of an expert panel. Washington, D.C.: Interprofessional Education Collaborative. Retrieved from http://www.aacn.nche.edu/education-resources/ipecreport.pdf.

  • Institute of Medicine (IOM). (2012). Health IT and patient safety: Building safer systems for better care. Washington, DC: The National Academies Press.

  • Lawlis, T. R., Anson, J., & Greenfield, D. (2014). Barriers and enablers that influence sustainable interprofessional education: A literature review. Journal of Interprofessional Care, 28(4), 305-310.

  • Manos, E. L. (2012). Interprofessional collaborative acute care practice: Pediatrics (ICAP-Peds). Collaborative partnership/grant funded by Health Resources & Services Administration (HRSA) Grant Number UD7HP25056. University of Kansas School of Nursing, Kansas City, KS. 2012-2015.

  • McKenna, L., Boyle, M., Palermo, C., Molloy, E., Williams, B., & Brown, T. (2014). Promoting interprofessional understandings through online learning: A qualitative examination. Nursing & Health Sciences, 16(3), 321-326.

  • Oandasan, I., & Reeves, S. (2005). Key elements for interprofessional education. Part 2: Factors, processes, and outcomes. Journal of Interprofessional Care, 19(Suppl. 1), 39-48.

  • Patton, M. Q. (1990). Qualitative Evaluation and Research Methods (2nd ed.). Thousand Oaks, CA: Sage Publications, Inc.

  • Quality Matters. (2014). Higher Ed Program Rubric. 5th Ed. Retrieved from https://www.qualitymatters.org/rubric

  • Quality and Safety Education for Nurses (QSEN; n.d.). QSEN competencies. Retrieved from http://qsen.org.

  • Reeves, S., Goldman, J., & Oandasan, I. (2007). Key factors in planning and implementing interprofessional education in healthcare settings. Journal of Allied Health, 36(4), 231-235.

  • Shoemaker, M. J., Beasley, J., Cooper, M., Perkins, R., Smith, J., & Swank, C. (2011). A method for providing high-volume interprofessional simulation encounters in physical and occupational therapy education programs. Journal of Allied Health, 40(1), e15-e21.

  • Shoemaker, M. J., Platko, C. M., Cleghorn, S. M., & Booth, A. (2014). Virtual patient care: An interprofessional education approach for physician assistant, physical therapy, and occupational therapy students. Journal of Interprofessional Care, 28(4), 365-367.

  • Suter, E., Arndt, J., Arthur, N., Parboosingh, J., Taylor, E., & Deutschlander, S. (2009). Role understanding and effective communication as core competencies for collaborative practice. Journal of Interprofessional Care, 23(1), 41-51. http://dx.doi.org/10.1080/13561820802338579

  • University of Washington. (2011). Performance Assessment of Communication and Teamwork Novice Observer Form. Retrieved from http://collaborate.uw.edu/sites/default/files/files/PACT_RealTime_ShortForm_Generic_110611_copyright.pdf

  • Urbaniak, G. C., & Plous, S. (n.d.). Research Randomizer [Computer software]. Retrieved from https://www.randomizer.org

  • World Health Organization. (2010). Framework for Action on Interprofessional Education and Collaborative Practice. Geneva. Retrieved from http://www.who.int/hrh/resources/framework_action/en/ 

  • Zhang, C., Thompson, S., & Miller, C. (2011). A review of simulation-based interprofessional education. Clinical Simulation in Nursing, 7(4), e117-e126.

About the article

Received: 2016-04-03

Accepted: 2017-06-22

Published Online: 2017-09-15


Citation Information: International Journal of Health Professions, Volume 4, Issue 2, Pages 90–99, ISSN (Online) 2296-990X, DOI: https://doi.org/10.1515/ijhp-2017-0022.


© 2017 Kelli Lee Kramer-Jackman, Dory Sabata, Heather Gibbs, Judy Bielby, Jessie Bucheit, Sarah Bloom, Sarah Shrader. This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 3.0 License. BY-NC-ND 3.0
