Introduction
Medical doctors’ competencies are important because they define the key milestones of medical education, and the way these competencies are implemented in educational systems varies internationally. In the United States, the Accreditation Council for Graduate Medical Education has established six core competencies with corresponding milestones, sparking a lively debate, particularly in the context of residency training [
1]. Canada developed its competencies in the form of the “CanMEDS” framework, which presents the roles expected of a doctor; the framework was led by the Royal College of Physicians and Surgeons of Canada with the participation of various specialists and educators. Canadian residency training programs have been incorporating these competencies since the early 1990s, predating similar initiatives in the United States [
2]. The United Kingdom introduced competencies encapsulated in the “Good Medical Practice” guidelines and initiated outcome-based residency training in the mid-1990s [
3]. A common feature of these programs is their integration of undergraduate medical education with continuing professional development, a concept that took root in specialty training during the 1990s.
In South Korea, however, competency-based medical education (CBME) has taken a different form. Initially, CBME was introduced at the undergraduate level, with each university creating and implementing its own set of competencies, rather than adopting a unified approach through a representative federation. With regard to specialty-specific competencies, the Korean Institute of Medical Education and Evaluation released a report in 2010 on the development of a standardized curriculum for Korean residency programs, known as “RESPECT 100.” Subsequently, in 2013, the Korean Academy of Medical Sciences recommended a set of desirable educational competencies for residency training as part of a study aimed at reorganizing the curriculum for each specialty to enhance the efficiency of residency training [
4,
5]. In 2019, the Ministry of Health and Welfare announced an outcome-based approach to residency training [
6]. However, research on the actual training programs and the identification of competencies within residency training has been sporadic [
7,
8]. In recognition of the need to investigate the competencies of Korean doctors beyond undergraduate education and specialty training, the “Korean Doctor’s Role” was established in 2014 through policy research [
9]. Despite this development, the Korean Doctor’s Role has not been actively integrated with the competencies established at the undergraduate level or those for specialties. Consequently, CBME in Korea faces challenges due to the insufficient connection among doctors’ general competencies, specialty competencies, and undergraduate competencies.
Although the practice of CBME varies between South Korea and other countries, a shared characteristic is that each nation has developed its own framework outlining the desired competencies for doctors, with the aim of creating a continuum between undergraduate and graduate medical education. Within such a framework, the competencies expected at the undergraduate level, during residency, and at the specialist level are delineated, with the term “milestone” used to describe the competency levels at each stage [
10,
11]. Consequently, defining the competencies required of residents necessitates the establishment of a foundational framework for doctors’ competencies.
The “Korean Doctor’s Role,” which has been used as a framework for Korean doctors’ competencies, is important because it was the first formal presentation of Korean doctors’ competencies. However, it has several limitations, particularly in the context of residency training and undergraduate education. These include the challenge of evaluating intangible aspects such as doctors’ attitudes and qualities; variability in the specificity of skill descriptions, with some milestones articulated as broad concepts rather than concrete competencies; ambiguous categorization of competencies due to overlapping content; and unclear descriptions that encompass multiple competencies or use undefined terms. Amidst the coronavirus disease 2019 (COVID-19) pandemic, there has been a shift towards incorporating social considerations and a patient-centered approach in healthcare. Historically, competencies for doctors, both domestically and internationally, have been predominantly devised by experts, without adequately reflecting the perspectives of patients and the broader medical community. In response to these challenges, the National Evidence-based Healthcare Collaborating Agency initiated a study entitled “How to Improve the Educational System for Residency Training to Improve Patient-Centered Outcomes.” This study necessitated a reevaluation and restructuring of the Korean Doctor’s Competency Framework, a high-level conceptual model, to define the competencies required of residents more accurately.
Therefore, this study was conducted as part of an initiative to establish a competency framework for physicians, serving as a foundation for both undergraduate and graduate medical education [
12-
17]. The development of the competency framework proceeded in stages. In Stage 1, the research team formed an expert working group and devised a plan for constructing the framework. In Stage 2, the researchers compared and analyzed both domestic and international competencies for physicians; reviewed relevant literature, social networking services (SNS), and newspaper articles; and finalized the competency framework through discussion and consensus. This stage yielded a framework of six core competencies: (1) expertise in diseases and health, (2) communication with patients, (3) collaboration with colleagues, (4) guardianship of societal health, (5) professionalism in self-conduct, and (6) scholarship in academia. The framework is further detailed with 17 sub-competencies that are organically connected to the core competencies and 53 enabling competencies, which are the functional units that must be demonstrable in practice. In Stage 3, a Delphi study was conducted to validate the competency framework identified by the researchers. The Delphi method is the most widely used approach for decision-making in uncertain situations, synthesizing expert opinions to reach consensus. It has been employed in medical education for various purposes, including the development of curricula or assessment tools, the creation of educational resources, and the definition of competencies [
18,
19].
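As an illustrative note only, and not a restatement of the specific indices applied in this study, item-level agreement in Delphi studies is often quantified with Lawshe’s content validity ratio (CVR),

\[
\mathrm{CVR} = \frac{n_e - N/2}{N/2},
\]

where \(n_e\) is the number of panelists who rate an item as valid (essential) and \(N\) is the total number of panelists. The CVR ranges from \(-1\) to \(+1\), and items whose CVR falls below a threshold determined by the panel size are typically flagged for revision or removal in later rounds.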
The purpose of this study was to validate the competencies created by the researchers under the theme of “patient-centered doctor’s competencies in Korea” from the perspective of an expert panel. To reflect the views of patients and the medical community, a large-scale survey and a big data analysis of newspaper articles and SNS were conducted after the Delphi survey.
Discussion
The purpose of this study was to evaluate the validity of researcher-derived, patient-centered physician competencies using the Delphi method. The findings indicated that the majority of competencies were deemed valid. However, the competencies related to societal and academic roles contained items that were considered substandard in terms of validity and consensus, and the expert panel also provided a substantial number of comments on these competencies.
First, the competencies associated with doctors’ societal roles as “health advocates” or “healthcare leaders” had low validity, with numerous dissenting opinions. Specifically, the competencies deemed to have low validity included “protecting the health of the population through public health activities in the community (Competency 32),” “improving the efficiency of the healthcare organization through a cost-effective management system (Competency 33),” and “contributing to the elimination of healthcare disparities through fair distribution and equitable utilization of healthcare resources (Competency 35).” Furthermore, the competency “being able to coordinate various healthcare services needed for patient care (Competency 24),” initially categorized under the “collaborator” role with fellow healthcare workers, could also be considered relevant to societal roles, indicating an overall trend of low-validity items in this category. The primary reason for this is that the competencies associated with societal roles are not solely the responsibility of individual doctors; they require collaboration with other professionals such as public servants and social workers. Moreover, not all doctors are expected to engage in these activities.
Social competencies are not explicitly described in the General Medical Council’s competencies in the United Kingdom, which operates a public healthcare system [
3]. However, in the United States, the Accreditation Council for Graduate Medical Education (ACGME) competencies include “systems-based practice,” which focuses on understanding healthcare delivery systems and providing cost-effective care within the healthcare system [
1]. Similarly, the Canadian competencies include roles such as “health advocate” and “leader,” which compel physicians to address the health needs of patients or communities, manage health resources effectively, and strive for improvements in the healthcare system [
2]. However, similar to the results of this study, doctors have viewed social competencies such as the ACGME’s “systems-based practice” and the CanMEDS “health advocate” role as less important than other competencies [
21-
23]. A proposed explanation is that those who enter medical school are often not from socially disadvantaged backgrounds and have had fewer opportunities to encounter socially disadvantaged populations [
24]. To counteract this, efforts are being made to raise awareness of social competencies by offering experiences that highlight healthcare disparities and by maintaining a website dedicated to connecting clinical medicine with social determinants of health [
24].
In contrast, the growing interest in the social competence of doctors during the COVID-19 pandemic seems to have contributed positively to the validation of the “healthcare leader” competency [
25-
27]. Throughout the pandemic, doctors and medical education professionals have recognized the importance of understanding illness not merely as an individual patient issue, but within the broader context of a society’s healthcare system. They have also acknowledged the significance of offering medical support to vulnerable populations during treatment and vaccination efforts [
28,
29].
The researchers concluded that doctors bear a collective responsibility to society as “leaders” who are pivotal in healthcare organizations, both in preventing illness and in protecting the health of patients and communities. However, the term “social accountability” as described in the “Korean Doctor’s Role” (2014) is typically associated with providing limited support for vulnerable populations and does not fully capture the broader concept of improving patient and community health or delivering healthcare services. Therefore, this study adopted the more encompassing and general term “healthcare leader.” Additionally, given the low validity and consensus indices from the expert panel, the competencies were articulated in terms of “exercising expertise” and “contributing,” reflecting the level of responsibility that an individual doctor can and should assume in society.
Second, there was disagreement about doctors’ competencies in academia as “scholars” or “contributors to the advancement of medicine.” While there was a consensus on the importance of “continuing professional development,” many argued that engaging in “education” and “research” should be expected of doctors in tertiary hospitals, but not necessarily of all doctors. This sentiment was echoed in a survey examining perceptions of CanMEDS competencies, which showed that the scholarly competency associated with research was not only deemed the fifth least important out of seven competencies but also exhibited the largest discrepancy in perceived importance between generalists and specialists [
22]. The present study concluded that doctors in primary or secondary care settings should also foster learning among peers and possess the ability to pose academic questions and seek scientific solutions within the medical field, and the competency descriptions were therefore worded in these general terms. However, terms such as “academic,” “scholar,” or “research” were replaced with “a person who contributes to the advancement of medicine” so as not to overburden general practitioners. Additionally, the phrase “contributes to the conduct of research to generate knowledge and the dissemination of its results” was removed because it suggested direct involvement in research, which was not the intent of Competency 52.
This study has several strengths. First, we are confident that the Delphi method was effectively employed to validate the competencies of doctors. The expert panel participating in the Delphi study was diverse, including doctors from primary and secondary hospitals as well as university hospitals, along with government officials. The panel was not limited to clinicians; it also comprised experts in basic medicine, education, and administration. Furthermore, the hospitals where the panel members practiced were geographically representative. Second, in addition to the Delphi study, our research on competencies is comprehensive, involving the analysis of domestic and international literature, social media, and newspaper articles. We are also conducting large-scale surveys targeting citizens, nurses, medical students, residents, and specialists, and organizing public hearings to gather a wide range of perspectives.
However, there are limitations to this study. As mentioned earlier, disagreements regarding the “healthcare leader” and “contributor to the advancement of medicine” competencies are likely to persist even if statistical validity and consensus levels are satisfactory. Although we have removed ambiguity and used language that is easier to understand, there may be difficulties in using these competencies in the education of students or specialists, and their meaning may change during implementation.
The “patient-centered doctor’s competencies” developed in this study are significant for several reasons. First, the concept of competence pertains to the abilities a doctor possesses at the culmination of training. Therefore, it is anticipated that milestones—incremental competencies symbolizing the educational objectives for medical students and residents—will be established and serve as a foundation for their training. Second, the development of a competence evaluation system would enable the monitoring and individual assessment of medical students’ and residents’ competencies. Third, such an assessment system could pinpoint areas of deficiency and guide the creation of educational programs designed to address these shortcomings.
In a follow-up study, it will be necessary to examine validity not only through a Delphi study, which is a content validation process, but also through the statistical evaluation of construct validity or criterion validity. In addition, efforts should be made to develop patient-centered competencies through comparative studies of experts’ views on doctors’ competencies and the views of patients and society. Above all, it is necessary to develop competencies with specificity, clarity, transparency, and applicability in mind, and to re-measure their validity while applying them to the education of students and specialists.
This study focused on developing and validating patient-centered doctors’ competencies, and it is hoped that these competencies will be used in the future for stepwise competency development, competency assessment systems, and educational program development.