DEVELOPMENT AND VALIDATION OF A STRUCTURED CLINICAL ASSESSMENT TOOL FOR ASSESSING STUDENT NURSES’ CLINICAL COMPETENCE

ABSTRACT

Assessment of clinical performance contributes to academic qualifications that incorporate professional awards. The administrators of schools of nursing face the problem of subjectivity in the practical examination of student nurses. This is evident in examination situations in which the examiner assigns any task of choice to the student and scores the student based on his/her perception of the student’s competence in performing the task. As a result, some students are exposed to more difficult tasks than others and to subjective scoring, depending on the inclination of the examiner. In response to this problem, this study developed and validated a Structured Clinical Assessment Tool (SCAT) that makes it possible for all students to be examined on the same tasks in any examination episode and judged on the same premise. An instrumentation research design was used. One hundred and thirty-seven student nurses from three Schools of Nursing in the South East Zone of Nigeria formed the sample for the study. Prior to developing the tool, a competency assessment framework was developed based on the nursing process model, with the five steps of the process serving as the core competencies and sub-skills identified for each core competency. The appropriateness of the sub-skills was verified by 52 nurse educators. The care sub-skills were pooled to form the model for SCAT. The model consists of twelve activity stations, which are examination points where students perform specified nursing tasks and are scored against a predetermined standard. Initially, 48 items (four per station) and their scoring guide were generated, and four experienced nurse educators/managers verified their appropriateness. Thirty-six items survived the validation exercise using the average congruency percentage. Data collected were analysed using the alpha coefficient, t-test and analysis of variance. The results of the analysis confirmed the validity of the 36 items and showed that the items were able to discriminate between high and low achievers. The high reliability indices (0.84-0.99) for most of the procedure station items and the moderate indices (0.69-0.78) for the others confirm that the instrument has good inter-scorer consistency and is therefore reliable. Based on these findings, SCAT has the potential to reduce the subjectivity inherent in clinical assessments based on observation and is therefore recommended for assessing the clinical competence of student nurses.

TABLE OF CONTENTS

TITLE PAGE
APPROVAL
CERTIFICATION
DEDICATION
ACKNOWLEDGEMENT
ABSTRACT
TABLE OF CONTENTS
LIST OF TABLES

CHAPTER ONE: INTRODUCTION
Background to the Study
Statement of the Problem
Purpose of the Study
Significance of the Study
Scope of the Study
Research Questions
Research Hypotheses

CHAPTER TWO: REVIEW OF RELATED LITERATURE
Introduction
Conceptual Framework
-   Definition of Nursing
-   Clinical Competence in Nursing
-   Nursing Process
-   Competency Outcome Performance Assessment
-   Competency Assessment Framework
Theoretical Framework
Organizational Theories
-   Max Weber's Theory of Bureaucracy
-   Getzel and Guba's Theory of Organizational Behaviour
Developing Criterion-Referenced Measures
-   Determining the Conceptual Framework
-   Explicating Objectives or Domain Definition
-   Preparing Test Specifications
Validating a Clinical Competency Assessment Tool
Empirical Studies on Instrumentation
Summary of Reviewed Literature

CHAPTER THREE: RESEARCH METHODOLOGY
Introduction
Research Design
Area of the Study
Population of the Study
Sample and Sampling Procedure
Instrument for Data Collection
Development of SCAT
Validity of the Instrument
Trial Testing of the Instrument
Reliability Testing of the Instrument
Method for Data Collection
Method of Analysis of Data

CHAPTER FOUR: PRESENTATION AND ANALYSIS OF DATA
Presentation of Data
Research Question 1
Research Question 2
Research Question 3
Research Question 4
Hypothesis 1
Hypothesis 2
Summary of Findings

CHAPTER FIVE: DISCUSSION OF RESULTS, CONCLUSION AND RECOMMENDATION
Introduction
Appropriateness of Tasks and the Activities for Assessing Competence
Validity of SCAT
Reliability of SCAT
Hypotheses Testing
Conclusion
Educational Implication of the Study
Recommendations
Limitations of the Study
Summary

REFERENCES
APPENDICES

CHAPTER ONE: INTRODUCTION

Background to the Study

Effective administration requires rational decision making that leads to the selection of the best way to reach the anticipated goal. In trying to achieve the ultimate goal of improving learning and learning opportunities so as to produce competent graduates, the educational administrator is responsible for making decisions on such issues as selecting an appropriate curriculum, appropriate teaching methods, and appropriate methods for assessing students' progress. If appropriate decisions are made on these issues, sound educational policies will follow and the goals of education will be met.

However, if inappropriate decisions are made, particularly on methods of assessing students, society is exposed to the danger of incompetent practice. This is because learners who have not acquired the necessary knowledge and skills for competent practice may be certified as qualified to practice and yet fail to give quality and safe care.

Generally, the school curriculum is organized to expose students to subjects that give them the opportunity to acquire the knowledge and skills needed for practice. Yet students who have passed written examinations and been certified fit to practice sometimes fail to perform competently in practice.

Considering the legal and financial implications of employee performance and safe practice in a rapidly changing environment, a major concern of the educational administrator of an institution should be to produce competent manpower. It is therefore important, when assessing students for certification to practice, in this case in a health care institution, to generate appropriate data that will help in deciding whether they can perform the tasks that the knowledge they have acquired should enable them to accomplish. This can be done only if an appropriate assessment tool is in place.

Stressing the importance of assessing what nursing care providers can do, not what they know, Del Bueno (1990) cited situations in which people who had performed excellently in examinations had difficulty performing a procedure or recognizing warning signs in patients experiencing difficulty. This kind of situation is unacceptable and informed the reforms in nursing education that led to calls for assessment of clinical performance to contribute to academic qualifications that incorporate professional awards. In response to this call, training institutions have developed clinical assessment tools. However, Redfern, Norman, Calman, Watson and Murrels (2002) expressed concern about the psychometric quality of the available tools and their ability to distinguish between different levels of practice. They analysed some tools for assessing competence to practice in nursing, while Norman, Watson, Murrels, Calman and Redfern (2002) tested selected nursing and midwifery competence assessment tools for reliability and validity. Both teams of researchers concluded that a multi-method approach, which enhances validity and ensures comprehensive assessment, is needed for the assessment of clinical competence in nursing and midwifery.

In order to produce such a tool, Lenburg (2006) created a constellation of ten basic concepts and suggested that they be adapted for developing and implementing objective performance examinations. They include:

•      Concept of examination

•      Dimensions of practice

•      Required critical elements

•      Objectivity of the assessment process

•      Sampling critical skills for the testing period

•      Level of acceptability

•      Comparability in extent, difficulty and requirements

•      Consistency in implementation

•      Flexibility in the actual clinical environment

•      Systematized conditions.

These concepts are very useful for the development of accurate assessment instruments. Thus far, in the nursing context in Nigeria, such a tool does not exist. The administrators of nursing schools are facing the problem of subjectivity in the practical examination of student nurses. This is evident in situations where students are given different tasks to perform during clinical examination and awarded grades based on the tasks they perform. As a result, some students are exposed to more difficult tasks than others, depending on the inclination of the examiner, yet all are judged against the same maximum score. This is unfair. It is therefore necessary to develop an assessment tool that will examine the students on the same tasks for a particular examination episode.

In order to accomplish this, consideration should be given to the concepts proposed by Lenburg (2006) mentioned earlier. To achieve objectivity in an assessment process, two components must be considered. First, the content (skills and critical elements) for the particular assessment should be specified in writing; second, there should be consensual agreement among everyone directly involved in any aspect of the examination process. When individual examiners begin to digress from the established standards and protocols, objectivity erodes back into subjectivity and inconsistency. This regression destroys both the process and the purpose.

 To prevent this from occurring, the educational administrator should ensure that the content of the examination is specified by the list of the dimensions of practice, that is, the skills and competencies and their required critical elements that determine the extent and conditions of competence. 

 The use of a conceptual framework to systematically guide the assessment process increases the likelihood that concepts and variables universally salient to nursing and health care practice will be identified and explicated (Waltz, Strickland & Lenz, 2005).

Concepts of interest to nurses and other health professionals are usually difficult to operationalise, that is, to render measurable. This is partly because nurses and other health professionals deal with a multiplicity of complex variables in diverse settings, taking on a myriad of roles as they collaborate with a variety of others to attain their own and others' goals. Hence, the dilemma they are apt to encounter in measuring concepts is twofold: first, the significant variables to be measured must somehow be isolated, and second, very ambiguous and abstract notions must be reduced to a set of concrete behavioural indicators. It is therefore the responsibility of the educational administrator, who knows the intended goals and selected the content meant to achieve them, to select the variables that must be measured and to reduce them to concrete behavioural indicators of competence. These should be incorporated into a protocol that will guide the assessor.

Protocols ensure that each test episode for a given group is comparable in extent, difficulty and requirements. A protocol also ensures that the process is implemented consistently, regardless of who administers the examination or when it is conducted. When performance examinations are administered in an actual clinical environment, not in simulation, the concept of flexibility is essential because each client is different. The responsible educational administrator, who prepares students for professional practice, is therefore challenged to develop appropriate competency-based assessment tools for use in the assessment of students' clinical competence.

A competency-based assessment tool focuses on measuring the actual performance of what a person can do rather than what the person knows. It is based on criterion-referenced assessment methods in which the learner's performance is assessed against a set of predefined criteria, so that both the learner and the assessor are clear about the performance required. Competency-based assessment addresses the psychomotor, cognitive and affective domains of learning, and its goal is to assess performance for the effective application of knowledge and skill in the practice setting. The competencies can be generic to clinical practice in any setting, specific to a clinical specialty, basic or advanced (Benner, 1982; Gurvis & Grey, 1995).

Criterion-referenced measures are particularly useful in the clinical area when the concern is the measurement of process and outcome variables, as applies in nursing. A criterion-referenced measure of process, according to Waltz, Strickland and Lenz (2005), requires that one identify standards for the client care intervention and compare the subject's clinical performance with the standard of performance, which is the predetermined target behaviour. When all of these considerations are taken into account in developing a clinical assessment tool, the tool is bound to be authentic.
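To make the criterion-referenced logic concrete, the sketch below scores a candidate's performance at a single activity station against a predetermined list of critical elements. It is only an illustration: the station name, the critical elements and the pass mark are hypothetical and are not items taken from the SCAT itself.

```python
# Illustrative sketch of criterion-referenced scoring at one activity station.
# The station name, critical elements, and pass mark below are hypothetical
# examples, not items from the SCAT.

def score_station(name, critical_elements, pass_mark, observed):
    """Compare observed behaviours against the predetermined critical elements."""
    met = [e for e in critical_elements if e in observed]
    proportion = len(met) / len(critical_elements)
    return {
        "station": name,
        "score": round(proportion, 2),
        "passed": proportion >= pass_mark,
        "missed": [e for e in critical_elements if e not in observed],
    }

# Hypothetical vital-signs station with four predetermined critical elements.
elements = [
    "explains procedure to patient",
    "positions patient correctly",
    "reads instrument accurately",
    "records findings",
]
observed = {"explains procedure to patient",
            "reads instrument accurately",
            "records findings"}

print(score_station("Vital signs", elements, pass_mark=0.75, observed=observed))
```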

Statement of the Problem

In Nigeria, assessment of clinical performance contributes to the academic qualification for a professional award. The Nursing and Midwifery Council of Nigeria (NMCN) has adopted the Objective Structured Clinical Examination (OSCE) for midwifery but has not done the same for the general nursing examination. The tool currently in use for clinical assessment in the general nursing examination leaves much to be desired. It lacks the comparability and consistency required to make an assessment tool objective and fair, hence the need for a structured clinical assessment tool. Some of the pitfalls of the tool include:

•      The tool allows the assessor to select the procedure each candidate will perform, and this selection varies from one candidate to another. The implication is that candidates do not all perform the same tasks, the tasks they perform are not comparable, and since task difficulty differs between tasks, the candidates are neither examined nor judged on the same premise. This is unfair.

•      A closely related problem is that the mark allotted to the item "procedure" is the same for all procedures, whether simple or complex; since some candidates are assigned simpler tasks than others yet are judged against the same maximum score for less work, the tool is unfair. Again, because the activities expected to be carried out for each procedure are not specified, the scoring of a candidate's performance is based on what the scorer thinks is right, and this may vary from one scorer to another. The implication is that, most times, the scoring is subjective.

•      Sometimes the length of time required to accomplish the task the assessor assigns to a candidate does not allow the assessor the opportunity to assess the candidate on all the areas listed on the clinical performance assessment guide. Since all the items sum up to give the maximum score, this creates the difficulty of deciding how to score those items, particularly as it is not the candidate's fault that he or she was not examined in those areas by the particular assessor.

•      Again, some of the criteria on which the candidates are judged are not stated in specific terms. For example, statements such as "handles patients gently and skillfully" and "adapts the environment for the patient's comfort" are not specific enough about what the candidate is expected to do and therefore leave room for the assessor's subjective conclusions. The implication of all these is that some of the results of assessments using this kind of tool are not valid and may have a negative impact on a candidate who fails when he or she should actually have passed, and on the consumers of nursing care where a candidate who has not acquired the necessary skills for competent and safe practice passes when he or she should have failed.

In view of this problem, there is a need to develop a clinical assessment tool that is objective and fair. That is the intent of this study.

Purpose of the Study

The main purpose of the study is to develop and validate a Structured Clinical Assessment Tool that provides the opportunity for all students to be examined on the same tasks in a particular examination period and to be scored against predetermined performance criteria. This will ensure a fair, objective and valid assessment of student nurses' clinical performance.

Specifically, the objectives are to:

1.     develop appropriate tasks for assessing student nurses' clinical competence;

2.     develop appropriate activities for determining competency in the tasks;

3.     determine the content validity of the Structured Clinical Assessment Tool (SCAT) that was developed;

4.     determine the construct validity of the Structured Clinical Assessment Tool (SCAT);

5.     determine the inter-rater reliability of the SCAT.

Significance of the Study

 The study will result in the availability of an instrument for a more comprehensive and objective clinical assessment of student nurses. Because the instrument will cover the core practice competency areas in nursing, it will be useful in determining whether or not student nurses have acquired the complex repertoire of knowledge, skills and attitudes required for competent practice before they enter the profession. The instrument will be useful to nurse educators and clinical supervisors/managers of health care institutions who are preparing students for practice because it will show them the core elements of competence in nursing and thus help them to guide the students appropriately to acquire the skills necessary to become competent and safe. It will also be useful to the students because they will know from the start what is expected of them, and being focused, they will work toward success.

The instrument will eliminate the problem of leaving the candidates to the whims and caprices of their assessors, which results in some candidates carrying out more complex tasks than others yet being judged on an equal score. Instead, the candidates will perform the same, specified tasks. This way, the candidates will be examined on the same premise and any judgment made on the results generated by the instrument will be worthy and valid.

Again, because the instrument breaks down the elements of competence into performance criteria against which performance can be judged acceptable, scoring of students' performance during assessment will be easier and less subjective, and the results will therefore be more authentic. The tool will serve as an impetus for the Nursing and Midwifery Council of Nigeria (NMCN) to revise the tool currently in use for the final qualifying examination to make it more objective and fair. If this is done, only those who have acquired the necessary knowledge and skills will be certified competent and licensed to practice, and the consumers of nursing care will be sure to receive quality and safe care. The tool will also serve as a reference for other researchers who may want to develop tools that address procedures not accommodated in the present study.

Scope of the Study

The study is delimited to developing a structured clinical assessment tool, developing a scoring scheme for the tool, establishing the content and construct validity of the tool, and determining the inter-rater reliability of the tool. Only the average congruency percentage for determining content validity, the mean and standard deviation of contrasted groups for determining construct validity, and an index of inter-rater agreement for determining reliability were computed.
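As an illustration of the average congruency percentage mentioned above, the sketch below assumes that each content expert rates every item as congruent (1) or not congruent (0) with its objective; each expert's percentage of congruent ratings is computed and then averaged across experts. The ratings and the 90% acceptability cut-off shown are hypothetical placeholders, not the study's data.

```python
# Illustrative sketch of the average congruency percentage (ACP) for content validity.
# Ratings and the 90% acceptability cut-off are hypothetical examples only.

def average_congruency_percentage(ratings):
    """ratings: one list per expert of 1 (congruent) / 0 (not congruent) item ratings."""
    per_expert = [100 * sum(r) / len(r) for r in ratings]
    return sum(per_expert) / len(per_expert)

expert_ratings = [
    [1, 1, 1, 0, 1, 1],   # expert 1
    [1, 1, 1, 1, 1, 1],   # expert 2
    [1, 0, 1, 1, 1, 1],   # expert 3
    [1, 1, 1, 1, 0, 1],   # expert 4
]

acp = average_congruency_percentage(expert_ratings)
print(f"ACP = {acp:.1f}%  ->  {'acceptable' if acp >= 90 else 'revise items'}")
```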

The clinical events assessed were limited to procedures that can be completed within 5 minutes. This was to ensure that students could be assessed on a good variety of events within the one hour normally allowed for practical examinations. Exposing them to procedures that take longer would limit the number of events on which they could be assessed. The tool, however, presupposes that the students would already have been assessed (using a structured assessment tool) on those procedures that take longer to accomplish prior to this final assessment.

 Though the tool is developed for assessing clinical competence of student nurses in Nigeria, the validation of the instrument was conducted in the South East zone of Nigeria using three randomly selected Schools of Nursing.

Research Questions

The study is guided by the following research questions:

1.     How appropriate are the tasks of SCAT for assessing student nurses’ clinical competence?

2.     How appropriate are the activities for determining competence in the selected items?

3.     How valid is the content of SCAT?

4.     What is the inter-rater reliability coefficient of SCAT?

Hypotheses

The following hypotheses were tested at an alpha level of 0.05:

Ho1: There is no significant difference in the mean scores on SCAT of high and low achievers.

Ho2: There is no significant difference in the scores of the students on any of the procedure stations of SCAT as determined by the three assessors.
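A minimal sketch of how these hypotheses could be tested at the stated alpha level, assuming SCAT total scores for contrasted groups of high and low achievers (Ho1) and the scores awarded to the same students on one procedure station by three assessors (Ho2). All score values below are fabricated placeholders; an independent-samples t-test and a one-way ANOVA from scipy are used here as simple stand-ins for the t-test and analysis of variance named in the abstract.

```python
# Illustrative sketch of the hypothesis tests at alpha = 0.05.
# All score values are fabricated placeholders, not study data.

from scipy import stats

ALPHA = 0.05

# Ho1: contrasted groups -- SCAT total scores of high vs. low achievers.
high_achievers = [82, 88, 79, 91, 85, 87]
low_achievers  = [61, 58, 66, 63, 59, 64]
t_stat, p_value = stats.ttest_ind(high_achievers, low_achievers)
print(f"Ho1: t = {t_stat:.2f}, p = {p_value:.4f}, "
      f"{'reject Ho1' if p_value < ALPHA else 'fail to reject Ho1'}")

# Ho2: scores on one procedure station as awarded by three assessors.
# A one-way ANOVA is used here as a simple stand-in for the ANOVA named in the abstract.
assessor_1 = [7, 8, 6, 9, 7, 8]
assessor_2 = [7, 7, 6, 9, 8, 8]
assessor_3 = [8, 8, 7, 9, 7, 8]
f_stat, p_value = stats.f_oneway(assessor_1, assessor_2, assessor_3)
print(f"Ho2: F = {f_stat:.2f}, p = {p_value:.4f}, "
      f"{'reject Ho2' if p_value < ALPHA else 'fail to reject Ho2'}")
```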

