Common Clinical Assessment Tool (CCAT) for Generic Paper-Based Users
Video Transcription
Hi, my name's Laura Bonanno, and I was part of a special interest group that developed the Common Clinical Assessment Tool. I'm pleased to present to you today the Common Clinical Assessment Tool Guidelines for Clinical Educators. The mission of the Council on Accreditation is to grant public recognition to nurse anesthesia programs and institutions that award post-master's certificates, master's, and doctoral degrees that meet nationally established standards of academic quality (quality assessment), and also to assist programs and institutions in improving educational quality. The outcomes for this webinar are to describe the significance of SRNA clinical evaluation, to identify the COA doctoral requirements for clinical evaluation, to describe the components of the Common Clinical Assessment Tool, and also to discuss the timeline for completion of the tool. In 2015, when the special interest group for the Common Clinical Assessment Tool was formed, a survey was sent to program administrators to identify the weaknesses of each program's current clinical evaluation tool and also the strengths. Some of the information that came from that survey was that the current evaluation tools used by programs were too long, that clinical instructors did not complete the evaluation tool, and if they did complete it, it was not completed in a timely manner. Also, clinical educators did not comment when a problem with a student existed, or they did not write comments at all. And another weakness was that the evaluation tool was still being done on paper. Some of the challenges that program administrators encountered with obtaining valid and reliable clinical evaluations from preceptors were that CRNAs were afraid to write anything that could be perceived as negative and that preceptors did not complete the evaluation. So overall, the rate of return on the daily evaluations was low. Clinical educators wanted to be anonymous in their evaluations, and they also said that, again, the tool was too long and still on paper, and that some CRNAs did not understand how to do a clinical evaluation. Another concern was that there was no security with paper evaluations at the hospital site, meaning that others could see the evaluations. So they really wanted the security of an online evaluation tool. In addition, we found that a single evaluation tool that has core competencies based on COA standards and that allows a program to insert other questions based on their individual program requirements was desirable. It was also noted that the tool should be concise, be available electronically and via paper, and have a section where clinical preceptors could include comments. The use of this tool would be optional, as some programs stated they still wanted to continue using their current clinical evaluation tool. So it was determined that program administrators did not want use of the Common Clinical Assessment Tool to be mandated, and also that the cost of using the tool should not be prohibitive to programs. So the Council decided that the use of the Common Clinical Assessment Tool would not cost programs anything; the use of the tool would be free. But we also determined that training was needed for both programs and clinical educators to ensure that the tool was being used appropriately and that we could get effective and efficient evaluations of our students.
Some of the other sources of information, in addition to that survey data, that informed the Common Clinical Assessment Tool were the practice doctorate standards, which were initially from 2015 but have since been updated; so we make sure that the tool aligns with the current standards. We also looked at the AACN common APRN doctoral competencies. We looked at some profile questions from the AANA. We looked at the NBCRNA national certification exam content outline. We looked at graduate QSEN competencies and IPEC core competencies, and we also relied on an article by Englander from 2013 entitled Toward a Common Taxonomy of Competency Domains for the Health Professions and Competencies for Physicians. This competency-based article has actually been used by other accreditation agencies as well, but we found it to be very informative when we were developing our tool. We wanted to determine the content validity of the tool, so we decided to do a Delphi study, and we did get IRB approval through LSU Health Sciences Center. Those included as participants in the Delphi study were program administrators, who had to have a doctoral degree and a minimum of one year of experience as a program administrator. We also surveyed program faculty, again with the requirements of a doctoral degree and a minimum of one year of experience. For clinical educators, we preferred a doctoral degree, but we know that the majority had a master's degree, so a master's degree and a minimum of one year of experience in providing clinical education were required. And we included students as well; for students to be included, they had to have completed one year of clinical education and be in good academic standing. So if you look at the COA standards, specifically 1.1, it says that formative and summative evaluations of each student are conducted for the purpose of counseling students and documenting student achievement. The daily clinical evaluation is considered the formative evaluation, and programs are also required to do summative evaluations, usually at the midpoint of a semester and at the end of the semester. This particular tool can be used for formative evaluations, or that daily clinical evaluation, and it can also be used for the summative evaluation, should the program decide to do that. Clinical evaluations are critical to the nurse anesthesia program. They are important in tracking the progression of student registered nurse anesthetists, and a common clinical assessment tool is beneficial because certain clinical sites may have multiple programs using that clinical site, so more standardization of the tool being used was important. It also allows assessment of theory versus clinical performance. And again, this tool, as you'll see when we move forward, is competency-based. The tool is also significant because it has the potential to benefit not only the individual students, the program faculty, and the clinical preceptors, but also the institutions. So the nurse anesthesia program will benefit because we're getting better-quality, more objective evaluations, and the evaluation is based on actual competencies and linked to the Council on Accreditation standards. So overall, getting that good, solid, objective feedback will benefit society as well, including patients and the healthcare system. Mandates for evaluation come from all different agencies and accrediting bodies. So we see it from the professional side.
We also see that from the accrediting body. And we also have to have that because our students have to graduate from an accredited program in order to be eligible for certification. The overall goals of evaluation are to define clinical outcomes, to validate behaviors, and to provide the student and the program with feedback on the student's clinical performance. It's also important so that we can identify any issues with students early, so that if we need to do any kind of remediation activities or further education of that particular student, we can do that. And also to improve overall quality and safety. So in developing the current Common Clinical Assessment Tool, we looked at some of the challenges that had been expressed in our surveys and also in some of the literature that we reviewed. We know that some of the evaluation tools lacked clarity. They lacked validity. We had an issue with timeliness. We did not have known reliability of the current tools. And we know that the clinical evaluation is inherently subjective, so those tools that were being used lacked objectivity. And we really wanted to improve the overall effectiveness of the clinical evaluations of our students. The Council on Accreditation of Nurse Anesthesia Educational Programs practice doctorate standards are designed to prepare graduates with competencies for entry into practice. Entry into practice competencies for the nurse anesthesia professional prepared at the practice doctorate level are those required at the time of graduation to provide safe, competent, and ethical anesthesia and anesthesia-related care to patients for diagnostic, therapeutic, and surgical procedures. And we know that the entry into practice competencies are really just the foundation; nurse anesthetists will continue to acquire more skills, knowledge, and abilities as they practice after graduation. So this is the foundation. Taking all of those things into account, the Common Clinical Assessment Tool Special Interest Group developed domains. We developed descriptors for each domain. And then we developed competencies for each domain and descriptor, and then progression indicators for each of those competencies. And this will make more sense when you actually see the tool; I know it's kind of a lot to take in at this point in time. The progression indicators first included a description of unsafe, novice, advanced beginner, competent, and proficient, with proficient deemed prepared for entry into practice. We revised the first indicator, initially termed unsafe, to safety concern. So when we did the Delphi study, which you'll see in just a minute, that was one of the changes that was made along the way. In the Delphi study, the Special Interest Group members were assigned to work groups by domain. The Delphi study was three rounds of feedback and revisions, and after each round, we had statistical analysis done for each domain. So initially, we developed what we thought was a pretty sound draft of the domains, the domain descriptors, et cetera. We put that together, and it was sent to the panel, which included, again, the program administrators, faculty, and students. Each time we sent that out, we got the feedback and we made revisions. So the groups met each time. They looked at the feedback on the domains, the domain descriptors, the competencies, and the progression indicators, and made those changes. And once those changes were made, it was sent out for the second round.
The same thing happened again. We took the feedback, saying, OK, we're getting closer, but we didn't quite get it right. And then by the third round, when we got the feedback, we felt like we had kind of reached saturation and had a tool that was valid and reliable. So as you can see here, this is just the Delphi process. Again, we did three rounds, and the final tool that you see today is based on those final revisions. We did have a statistician work with us as well to make sure that we had the analysis done accurately. So here are the domains that we developed in the final version. There are four domains. The first domain is patient safety and perianesthesia care. The domain descriptor for that is administers and manages comprehensive, safe, and patient-centered anesthesia care across the lifespan for a variety of procedures and physical conditions. The second domain is knowledge and critical thinking. The descriptor is comprehends, synthesizes, applies, and evaluates new and existing knowledge and experiences that guides clinical anesthesia decision making. The third domain, professional communication and collaboration, engages in effective communication with patients, their families, significant others, and other health care professionals to deliver safe, patient-centered anesthesia care. And the fourth domain, professional role, practices in a responsible and accountable manner that complies with professional, legal, ethical, and regulatory standards with an awareness and responsiveness to the larger health care system. So these are the four domains and the descriptors for each domain. And then what you see here is the actual tool itself. We'll go through the tool, but you can see that the domain will be at the top, and you can see the competency here. So this is just one example. The first competency is provides safe and vigilant patient care throughout the perianesthesia period, and underneath that is a descriptor for that competency. So what do we mean by that? The descriptor is timely response to alarms, audible indicators, anesthesia and/or surgical events, and limited distractions. So the descriptor provides the clarity of what the intention of that competency is. As you go across the top here are the progression indicators. Remember, the first one is safety concern. Then we go to novice, advanced beginner, competent, and then finally proficient, which is entry into practice. And what you'll see kind of consistent across all of these is that the safety concern is going to kind of begin with fails to. So here you can see fails to demonstrate safe practices throughout the perianesthesia period. When you go to the novice indicator, it's demonstrates safe practices throughout the perianesthesia period with continual direction. So the novice student is a student just beginning clinical training. When they move into the advanced beginner, they can demonstrate safe practices throughout the perianesthesia period with minimal direction, so they're moving forward. As you get to competent, then you'll see we're looking for them to independently identify safety concerns. And then when the student gets to proficient, ready for graduation, they identify and anticipate safety concerns and intervene if others are demonstrating unsafe practices. So you can see how it progresses based on the student's level in the program. So this is the tool itself, and what you're seeing here is going to be the paper copy. And so we'll kind of go through it.
Clearly, I won't read each specific competency, but I want to point out some examples and how this would be filled out by the clinical educator. So again, here, the first one, which we just kind of went over, was to provide safe and vigilant care. We talked about the progression indicators going across. What you would simply do as a clinical educator is check the box based on your evaluation of the student. Most programs will allow, at the beginning of the tool, for the level of the student to be identified. So that would be something that would be added, where you would identify what the student's level in the program was so that you would kind of know where they should be. The second competency is under this first domain descriptor. So at the top again is the domain, patient safety and perianesthesia care, with the descriptors that we just went over. The first one that I showed you on the previous slide was here. So this is the second competency, which is performs a comprehensive pre-anesthesia equipment checklist. The examples here are verifies availability and function of standard and emergency equipment and performs the required anesthesia machine check. Same thing here when you're going across: the safety concern is that they fail to do that. Because if they fail to do that, even if they are a novice, this was something that they would have had in their training before going to clinical. And so that would be a safety concern if that was just completely omitted. But as you go across here, you can see you'll start off with performs the comprehensive pre-anesthesia equipment check with minimal direction as a novice. When you move to advanced beginner, they do it independently; when you move to competent, they identify and report concerns. And at the proficient level, again, ready for graduation, they troubleshoot and resolve concerns. So you'll see this same progression for each particular domain and each competency that's listed. The same thing goes across the progression indicators, and we did bold the text so that you know kind of what's changed for that particular competency. There are some where you'll see this grayed-out area, which means that you are not able to check these two. And you'll see this a good bit under the professional role domain. But for this one in particular, delivers culturally competent perianesthesia care, you can see those examples underneath: that they incorporate cultural awareness, knowledge, sensitivity, and skills; that they recognize their own perspective and biases as well as the patient's; and that they include that in their decision making. So the safety concern is that they just fail to deliver culturally competent perianesthesia care. And then there really is no novice and advanced beginner; it's kind of one of those things where either you do or you don't. So you're not going to be able to select novice or advanced beginner. They're either going to be competent or proficient. Continuing on with domain one under patient safety, you can see here that the next several competencies are aligned with administers anesthesia for a variety of procedures and physical conditions in patients across the lifespan. And we have here induction, maintenance, emergence, and post-operative care, again with a descriptor underneath each one, so you can clearly see what we're looking for to evaluate that particular competency. And again, going across here, the progression is very similar for each of those competencies.
And this is the last competency under patient safety and perianesthesia care. Number six is specific to regional anesthesia techniques, and again, the examples are in the description underneath. Specific to regional anesthesia: the ability to verbalize indications, contraindications, and risks; verifying availability and function of standard and emergency equipment; identifying anatomic landmarks; appropriately selecting and administering the anesthetic medications; use of sterile technique; and then appropriate administration of the regional block and the ability to identify and manage complications. So again, you'll see that going across. Failing to do so would be the safety concern, with continual direction being novice, minimal direction being advanced beginner, and independently being competent. And then advocating for the use of regional anesthesia techniques for patients would be at the proficient, or ready for graduation, level. Moving into the second domain, which is knowledge and critical thinking, this is going to be the same throughout the whole tool: you'll see the domain, the domain descriptor that we reviewed previously, and then the competencies. So here, the first competency is uses knowledge, experience, and science-based principles to formulate an anesthesia plan. For this particular competency, under the safety concern, they fail to use that knowledge to develop a basic anesthesia plan. When you move into novice, they use the knowledge to develop a basic anesthetic plan with minimal direction. At advanced beginner, we're expecting them to do an individualized anesthetic plan with minimal direction. When they move into competent, they can do all of that independently. And then when you move to proficient, we're looking at the use of interprofessional collaboration. So you can see the same type of progression for each of these competencies underneath here: the competency itself, and then the descriptor or the examples underneath it. Along the same lines with this knowledge and critical thinking domain, we move into the ability to interpret data using noninvasive and invasive monitoring. We also look at the ability to calculate, initiate, and manage fluid and blood component therapy, and then also the recognition, evaluation, and appropriate management of physiological responses during anesthesia care. Again, you can see the same type of progression going across with these indicators. And under knowledge and critical thinking, the last competency is the recognition and appropriate management of complications during anesthesia care. And again, this is kind of like the one that we talked about earlier with cultural competence: you will not be able to select novice or advanced beginner, because either they have a safety concern, meaning they fail to recognize and appropriately manage complications, or they're competent, meaning they can recognize and manage those complications independently. A lot of this is based on, you know, just standard monitoring that we're doing intraoperatively. And I should have mentioned previously, under the first domain: after each domain here is the ability for the clinical preceptor to include comments. So any individualized or written comments would be included here. The third domain is professional communication and collaboration. Again, here's the domain descriptor, and the competencies all align with communication.
So the first one being utilizes effective communication skills with patients, their families, significant others, and other healthcare professionals. The example here is that they accept instruction and constructive feedback, and that they use effective, empathetic, and respectful verbal and nonverbal communication. So again, this is another example of one of those competencies where either they do or they don't. If they don't, they fail to utilize effective communication skills with patients, other caregivers, and their families, and that's going to be a safety concern. They should be able to utilize effective communication skills, and that would be at the competent level. Or if they're moving on to using it with interprofessional collaboration, then that would be at the proficient level, which would be expected at graduation. All of the competencies in this particular domain, again, are going to relate to either professional communication or collaboration. We have here maintains comprehensive, accurate, and legible healthcare records. Some of our clinical sites are still using paper charting, so that's still important, and the progression indicators are just as we've shown in the previous ones. We also include in here transferring the responsibility of patient care in a way that assures continuity and patient safety. So this is either report to the nurse in the post-anesthesia care unit or to the ICU. And then the fourth one, providing leadership that facilitates intraprofessional and interprofessional communication and collaboration. And here, the only one that they cannot select is the novice, so we expect them to have some ability to do that. But the first one that you would be able to select, short of the safety concern, would be the advanced beginner, meaning that they would do that with minimal direction. These are just a few more examples underneath that particular competency. And again, here is the ability to include narrative comments. The fourth domain is the professional role, and the descriptor is listed here, which we've already talked about: practices in a responsible and accountable manner that complies with professional, legal, ethical, and regulatory standards with an awareness and responsiveness to the larger healthcare system. So in this particular domain, you'll see this a lot, where you're not able to select novice or advanced beginner. And the reason for that is that all of the students in nurse anesthesia programs are already licensed professional nurses who should have some knowledge of what's required in the professional role. So the first competency is adheres to AANA and ANA codes of ethics. Either they fail to adhere to those codes of ethics or they do adhere, and if they're ready to graduate, we would expect them to do a little bit more interprofessionally. The next competency is adheres to AANA standards for practice. Again, either they don't adhere to the standards or they're competent. Again, some of the reasons why we included these competencies and structured the progression indicators the way we did are based on the feedback that we had from clinical educators, program administrators, and faculty: it was often difficult because many of the programs were using the Benner model, which is the novice-to-expert model. And so if they had a novice student who wasn't following the standards, then how did you score that on the evaluation tool?
So this is why we think that this particular safety concern indicator is really an important asset of this Common Clinical Assessment Tool. As you move down, you'll see the next one is interacts with professional integrity. Again, the safety concern is that they fail to, but they're expected to interact with professional integrity. And then we talk about professional, legal, and regulatory standards and that they adhere to institutional policies. So if they violate or don't comply with a professional, legal, or regulatory standard or policy, then that's going to be the safety concern. But we would expect them to comply, accept responsibility and accountability for their practice, and provide cost-effective anesthesia care regardless of their level in the program. And again, there is the ability to include comments here. So that is an overview of the Common Clinical Assessment Tool. We hope that you'll find this tool easy to use, and we hope that it'll help you in providing objective and better-quality feedback to the students. The students really rely on that feedback, and so do the nurse anesthesia programs. Thank you very much for your time, and I hope you find the Common Clinical Assessment Tool very valuable in your role as a clinical educator for our students. Thanks for all you do for the nurse anesthesia students.
Video Summary
The video features Laura Bonanno discussing the Common Clinical Assessment Tool Guidelines for Clinical Educators. The tool aims to improve the quality of clinical evaluations for nurse anesthesia students and was developed in response to identified weaknesses in existing evaluation methods, such as lack of completion, timeliness issues, and subjectivity. It is competency-based and aligns with professional and ethical standards. It includes four domains: patient safety and perianesthesia care, knowledge and critical thinking, professional communication and collaboration, and professional role. Each domain has specific competencies with progression indicators ranging from safety concern to proficient. The tool allows for objective evaluation, feedback, and tracking of student progress. A Delphi study was conducted to ensure the tool's validity and reliability. Training is recommended for program administrators and clinical educators to use the tool effectively.
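For illustration only, here is a minimal sketch in Python of how the structure described in the transcript (domains with descriptors, competencies with example descriptors, progression indicators, grayed-out indicators for "either you do or you don't" items, and per-domain comments) could be modeled. It is not part of the official CCAT; all class, field, and example names are hypothetical.

# Illustrative sketch only; not part of the official CCAT. Hypothetical names throughout.
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional, Tuple


class ProgressionIndicator(Enum):
    SAFETY_CONCERN = "Safety concern"
    NOVICE = "Novice"
    ADVANCED_BEGINNER = "Advanced beginner"
    COMPETENT = "Competent"
    PROFICIENT = "Proficient (entry into practice)"


@dataclass
class Competency:
    statement: str                                   # the competency itself
    descriptor: str                                  # examples clarifying its intent
    # Some competencies gray out novice/advanced beginner ("either you do or you don't")
    allowed: Tuple[ProgressionIndicator, ...] = tuple(ProgressionIndicator)
    rating: Optional[ProgressionIndicator] = None    # the box checked by the clinical educator

    def rate(self, indicator: ProgressionIndicator) -> None:
        # Refuse grayed-out indicators, mirroring the boxes that cannot be checked on the form
        if indicator not in self.allowed:
            raise ValueError(f"{indicator.value} is grayed out for this competency")
        self.rating = indicator


@dataclass
class Domain:
    name: str                                        # e.g., "Patient safety and perianesthesia care"
    descriptor: str
    competencies: List[Competency] = field(default_factory=list)
    comments: str = ""                               # narrative comments collected after each domain


# Example: a professional-role item where only safety concern, competent, or proficient apply.
ethics = Competency(
    statement="Adheres to AANA and ANA codes of ethics",
    descriptor="Either fails to adhere (safety concern) or adheres (competent/proficient)",
    allowed=(
        ProgressionIndicator.SAFETY_CONCERN,
        ProgressionIndicator.COMPETENT,
        ProgressionIndicator.PROFICIENT,
    ),
)
ethics.rate(ProgressionIndicator.COMPETENT)

In this sketch, rating a competency with a grayed-out indicator raises an error, mirroring how the paper and electronic forms prevent selecting those boxes.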
Keywords
Clinical Assessment Tool
Nurse Anesthesia
Competency-based Evaluation
Clinical Educators
Patient Safety
Delphi Study