14 records – page 1 of 2.

OSCE checklists do not capture increasing levels of expertise.

https://arctichealth.org/en/permalink/ahliterature200451
Author
B. Hodges
G. Regehr
N. McNaughton
R. Tiberius
M. Hanson
Author Affiliation
Department of Psychiatry, Faculty of Medicine, University of Toronto, Ontario, Canada. brian.hodges@utoronto.ca
Source
Acad Med. 1999 Oct;74(10):1129-34
Date
Oct-1999
Language
English
Publication Type
Article
Keywords
Analysis of Variance
Clinical Clerkship
Clinical Competence
Education, Medical - methods
Educational Measurement - methods
Family Practice - education
Humans
Internship and Residency
Mental Disorders - diagnosis
Ontario
Psychiatry - education
Reproducibility of Results
Abstract
To evaluate the effectiveness of binary content checklists in measuring increasing levels of clinical competence.
Fourteen clinical clerks, 14 family practice residents, and 14 family physicians participated in two 15-minute standardized patient interviews. An examiner rated each participant's performance using a binary content checklist and a global process rating. The participants provided a diagnosis two minutes into and at the end of the interview.
On global scales, the experienced clinicians scored significantly better than did the residents and clerks, but on checklists, the experienced clinicians scored significantly worse than did the residents and clerks. Diagnostic accuracy increased for all groups between the two-minute and 15-minute marks without significant differences between the groups.
These findings are consistent with the hypothesis that binary checklists may not be valid measures of increasing clinical competence.
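The group comparison described above lends itself to a short worked example. A minimal sketch (not the study's own analysis code), assuming SciPy is available and using made-up checklist scores for three groups of 14:

    # One-way ANOVA comparing hypothetical binary-checklist scores (percent of
    # items completed) across three expertise groups of 14 participants each.
    # All values are illustrative; they are not data from the study.
    import numpy as np
    from scipy import stats

    clerks     = np.array([78, 82, 75, 80, 84, 79, 77, 81, 83, 76, 80, 79, 82, 78])
    residents  = np.array([74, 79, 72, 77, 80, 75, 73, 78, 81, 74, 76, 75, 79, 77])
    physicians = np.array([65, 70, 62, 68, 72, 66, 64, 69, 71, 63, 67, 66, 70, 68])

    f_stat, p_value = stats.f_oneway(clerks, residents, physicians)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
    for name, group in (("clerks", clerks), ("residents", residents), ("physicians", physicians)):
        print(name, round(group.mean(), 1))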
PubMed ID
10536636

Process-rating forms versus task-specific checklists in an OSCE for medical licensure. Medical Council of Canada.

https://arctichealth.org/en/permalink/ahliterature204104
Author
R K Reznick
G. Regehr
G. Yee
A. Rothman
D. Blackmore
D. Dauphinée
Author Affiliation
University of Toronto, Centre for Research in Education, Toronto Hospital, ON, Canada.
Source
Acad Med. 1998 Oct;73(10 Suppl):S97-9
Date
Oct-1998
Language
English
Publication Type
Article
Keywords
Canada
Education, Medical
Educational Measurement - methods
Humans
Licensure, Medical
Pilot Projects
PubMed ID
9795665

Effectiveness of telehealth for teaching specialized hand-assessment techniques to physical therapists.

https://arctichealth.org/en/permalink/ahliterature196953
Author
W. Barden
H M Clarke
N L Young
N. McKee
G. Regehr
Author Affiliation
Department of Rehabilitation Services, Hospital for Sick Children, Toronto, Ontario, Canada. wendy.barden@sickkids.on.ca
Source
Acad Med. 2000 Oct;75(10 Suppl):S43-6
Date
Oct-2000
Language
English
Publication Type
Article
Keywords
Analysis of Variance
Chi-Square Distribution
Clinical Competence
Education, Distance
Educational Measurement
Hand - physiology
Humans
Ontario
Physical Therapy Modalities - education
Teaching Materials
PubMed ID
11031170

Who should rate candidates in an objective structured clinical examination?

https://arctichealth.org/en/permalink/ahliterature212799
Author
J A Martin
R K Reznick
A. Rothman
R M Tamblyn
G. Regehr
Author Affiliation
Department of Surgery, University of Toronto Faculty of Medicine, Ontario, Canada.
Source
Acad Med. 1996 Feb;71(2):170-5
Date
Feb-1996
Language
English
Publication Type
Article
Keywords
Analysis of Variance
Canada
Clinical Medicine - education - standards
Educational Measurement - methods
Educational Technology
Humans
Medical History Taking
Mental Recall
Patient Simulation
Physicians
Pilot Projects
Reproducibility of Results
Videotape Recording
Abstract
To determine who is the better rater of history taking in an objective structured clinical examination (OSCE): a physician or a standardized patient (SP).
During the 1991 pilot administration of an OSCE for the Medical Council of Canada's qualifying examination, five history-taking stations were videotaped. Candidates at these stations were scored by three raters: a physician (MD), an SP observer (SPO), and an SP rating from recall (SPR). To determine the validity of each rater's scores, these scores were compared with a "gold standard", which was the average of videotape ratings by three physicians, each scoring independently. Analysis included both correlations with the standard and a repeated-measures analysis of variance (ANOVA) comparing raters' mean scores on each station with mean scores of the gold standard.
Ninety-one videotapes were scored by the "gold-standard" physicians. Correlations with the standard showed no clear preference for MD, SPO, or SPR raters. ANOVAs revealed significant differences from the standard on three stations for the SPR, two stations for the SPO, and one station for the MD.
An MD rater is less likely to differ from a standard established by a consensus of MD ratings than are SP raters rating from recall. If an MD cannot be used, an SP observer is preferable to an SP rating from recall.
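A minimal sketch of the validity comparison described in the abstract: each rater type's station scores are correlated with a gold standard formed by averaging three independent physician ratings of the videotapes. All numbers below are made up, and the coefficient is assumed to be Pearson's r (the abstract does not specify).

    import numpy as np

    # Hypothetical scores given to the same eight candidates by three independent
    # physicians rating the videotapes; their mean forms the gold standard.
    md1 = np.array([7.0, 6.5, 8.0, 5.5, 7.5, 6.0, 8.5, 7.0])
    md2 = np.array([6.5, 6.0, 8.5, 5.0, 7.0, 6.5, 8.0, 7.5])
    md3 = np.array([7.5, 6.0, 8.0, 6.0, 7.0, 6.0, 9.0, 7.0])
    gold = (md1 + md2 + md3) / 3

    # Hypothetical scores from the three rater types compared in the study.
    raters = {
        "MD":  np.array([7.0, 6.0, 8.5, 5.5, 7.5, 6.5, 8.5, 7.0]),  # physician examiner
        "SPO": np.array([6.5, 7.0, 8.0, 6.0, 7.0, 6.0, 8.0, 7.5]),  # SP observer
        "SPR": np.array([6.0, 7.5, 7.0, 6.5, 6.5, 7.0, 7.5, 8.0]),  # SP rating from recall
    }

    for name, scores in raters.items():
        r = np.corrcoef(scores, gold)[0, 1]
        print(f"{name}: r = {r:.2f}")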
PubMed ID
8615935

A comprehensive examination for senior surgical residents.

https://arctichealth.org/en/permalink/ahliterature198481
Author
H. MacRae
G. Regehr
W. Leadbetter
R K Reznick
Author Affiliation
Department of Surgery and Centre for Research in Education, Faculty of Medicine, University of Toronto, Toronto, Ontario, Canada.
Source
Am J Surg. 2000 Mar;179(3):190-3
Date
Mar-2000
Language
English
Publication Type
Article
Keywords
Clinical Competence - standards
Educational Measurement - methods - standards
Feasibility Studies
General Surgery - education
Humans
Internship and Residency - classification - standards
Ontario
Reproducibility of Results
Time Factors
Abstract
Two complementary examinations designed to comprehensively assess competence for surgical practice have been developed. The Objective Structured Assessment of Technical Skill (OSATS) evaluates a resident's operative skill, and the Patient Assessment and Management Examination (PAME) evaluates clinical management skills.
Twenty-four postgraduate year (PGY)-4 and PGY-5 general surgery residents from four training programs were examined. Each examination had eight stations, with a total of 6 hours of testing time.
Interstation reliability was 0.64 for the OSATS, 0.71 for the PAME, and 0.74 for the total test. Examination scores discriminated between PGY-4 and PGY-5 residents for the OSATS (t = 4.39, P
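A minimal sketch of one common way to obtain interstation reliabilities like the 0.64 and 0.71 above: Cronbach's alpha computed over a candidates-by-stations score matrix. The abstract does not state which coefficient was used, so that is an assumption here, and the small score matrix is made up.

    import numpy as np

    def cronbach_alpha(scores):
        # scores: candidates x stations matrix of station scores
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]                               # number of stations
        item_var_sum = scores.var(axis=0, ddof=1).sum()   # sum of per-station variances
        total_var = scores.sum(axis=1).var(ddof=1)        # variance of candidates' total scores
        return (k / (k - 1)) * (1 - item_var_sum / total_var)

    # Illustrative 6-candidate x 4-station matrix (the real examinations had
    # 24 residents and 8 stations each).
    demo = [[3, 4, 3, 4],
            [4, 5, 4, 4],
            [2, 3, 3, 2],
            [5, 5, 4, 5],
            [3, 3, 2, 3],
            [4, 4, 5, 4]]
    print(round(cronbach_alpha(demo), 2))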
PubMed ID
10827317

A computer-based trauma simulator for teaching trauma management skills.

https://arctichealth.org/en/permalink/ahliterature198480
Author
M K Gilbart
C R Hutchison
M D Cusimano
G. Regehr
Author Affiliation
Department of Surgery, Centre for Research in Education, University of Toronto, Toronto, Ontario, Canada.
Source
Am J Surg. 2000 Mar;179(3):223-8
Date
Mar-2000
Language
English
Publication Type
Article
Keywords
Analysis of Variance
Clinical Clerkship
Clinical Competence
Computer Simulation
Computer-Assisted Instruction
Educational Measurement
Humans
Judgment
Leadership
Manikins
Multiple Trauma - surgery
Ontario
Personal Satisfaction
Self Concept
Students, Medical
Teaching - methods
Transfer (Psychology)
Traumatology - education
Abstract
The management of multiply injured trauma patients is a skill requiring broad knowledge, sound judgment, and leadership capabilities. The purpose of this study was to evaluate the effectiveness of a computer-based trauma simulator as a teaching tool for senior medical students.
All year-4 clinical clerks at the University of Toronto were approached to participate in a focused, 2-hour trauma management course. The volunteer rate for the course was 79%. Students were randomized to either computer-based simulator or seminar-based teaching groups. Outcome measures in this study were students' trauma objective structured clinical examination (OSCE) scores.
Both the trauma simulator and seminar teaching groups performed significantly better than the comparison group (no additional teaching) on the trauma OSCE patient-encounter component, but not on the written component of the examination. There was no significant difference between the performances of the trauma simulator and seminar teaching groups. Students overwhelmingly felt that the trauma simulator was effective for their trauma teaching and improved their overall confidence in clinical trauma scenarios.
There is a significant benefit associated with a focused, clinically based trauma management course for senior medical students. No additional improvement was noted with the use of a high-fidelity computer-based trauma simulator.
PubMed ID
10827325

The effect of early performance on examiners' marking patterns during an oral examination.

https://arctichealth.org/en/permalink/ahliterature213610
Source
Acad Med. 1996 Jan;71(1 Suppl):S73-5
Publication Type
Article
Date
Jan-1996

Evaluating surgical resident selection procedures.

https://arctichealth.org/en/permalink/ahliterature194550
Author
M K Gilbart
M D Cusimano
G. Regehr
Author Affiliation
Department of Orthopaedic Surgery, University of Toronto, Toronto, ON, Canada.
Source
Am J Surg. 2001 Mar;181(3):221-5
Date
Mar-2001
Language
English
Publication Type
Article
Keywords
Canada
Data Interpretation, Statistical
Humans
Internship and Residency
Orthopedics - education
Personnel Selection - methods - standards
Reproducibility of Results
Abstract
The purposes of this study were to develop and assess a rating form for selection of surgical residents, determine the criteria most important in selection, determine the reliability of the assessment form and process both within and across sites, and document differences in procedure and structure of resident selection processes across Canada.
Twelve of 13 English-speaking orthopedic surgery training programs in Canada participated during the 1999 selection year. The critical incident technique was used to determine the criteria most important in selection. From these criteria, a 10-item rating form was developed, with each item rated on a 5-point scale. Sixty-six candidates were invited for interviews across the country. Each interviewer completed one assessment form for each candidate and independently ranked all candidates at the conclusion of all interviews. Consensus final rank orders were then created for each residency program. Across all programs, pairwise program-by-program correlations were computed for each assessment parameter.
The internal consistency of the assessment form ratings for each interviewer was moderately high (mean Cronbach's alpha = 0.71). Correlating each item with the final rank order for each program revealed that work ethic, interpersonal qualities, orthopedic experience, and enthusiasm correlated most highly with final candidate rank orders (r = 0.50, 0.48, 0.48, and 0.45, respectively). The interrater reliabilities (within panels) and interpanel reliabilities (within programs) for the rank orders were 0.67 and 0.63, respectively. Using the Spearman-Brown prophecy formula, it was found that two panels with two interviewers on each panel are required to obtain a stable measure of a given candidate (reliability of 0.80). The average pairwise program-by-program correlation for the final candidate rank orders was low (0.14).
A method was introduced to develop a standard, reliable candidate assessment form to evaluate residency selection procedures. The assessment form ratings were found to be consistent within interviewers. Candidate assessments within programs (both between interviewers and between panels) were moderately reliable, suggesting agreement within programs regarding the relative quality of candidates, but there was very little agreement across programs.
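The Spearman-Brown prophecy formula mentioned in the results predicts the reliability obtained when the number of measurements (here, raters or panels) is multiplied by a factor k: r_k = k*r / (1 + (k - 1)*r). A minimal sketch, using the reported within-panel reliability of 0.67 as the single-measure value (an illustrative assumption about how the formula was applied):

    def spearman_brown(r, k):
        # Predicted reliability when the number of measurements is multiplied by k.
        return k * r / (1 + (k - 1) * r)

    # Doubling the raters from a base reliability of 0.67 predicts roughly 0.80,
    # consistent with the two-panels-of-two-interviewers result in the abstract.
    print(round(spearman_brown(0.67, 2), 2))  # 0.8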
PubMed ID
11376575

Validation of an objective structured clinical examination in psychiatry.

https://arctichealth.org/en/permalink/ahliterature204595
Author
B. Hodges
G. Regehr
M. Hanson
N. McNaughton
Author Affiliation
Department of Psychiatry and Centre for Research in Education, Faculty of Medicine, University of Toronto, Ontario, Canada. brian.hodges@utoronto.ca
Source
Acad Med. 1998 Aug;73(8):910-2
Date
Aug-1998
Language
English
Publication Type
Article
Keywords
Clinical Clerkship
Clinical Competence
Humans
Internship and Residency
Mental Disorders - diagnosis
Ontario
Psychiatry - education
Reproducibility of Results
Abstract
To examine the validity of a psychiatry clerkship's objective structured clinical examination (OSCE).
In 1996, 33 clinical clerks and 17 psychiatry residents at the University of Toronto participated in an eight-station OSCE evaluated by psychiatrist-examiners using binary checklists and global ratings. Prior to the OSCE, communication course instructors were asked to rank the clerks on interviewing ability, and faculty supervisors were asked to identify the OSCE stations on which the clerks were likely to do well or poorly.
Mean OSCE scores were significantly higher for the residents than for the clerks on global ratings but not on checklists. The communication instructors accurately predicted the clerks' rankings on the global scores but not their scores on the checklists. The faculty supervisors predicted with moderate accuracy the clerks' success on the OSCE stations as measured by the checklists but not by the global ratings. The residents rated the OSCE scenarios as highly realistic.
The evidence of construct and concurrent validity, together with the high ratings of realism, suggests that a psychiatry OSCE can be a valid assessment of clerks' clinical competence.
PubMed ID
9736854

Using operative outcome to assess technical skill.

https://arctichealth.org/en/permalink/ahliterature196576
Author
D. Szalay
H. MacRae
G. Regehr
R. Reznick
Author Affiliation
Faculty of Medicine, University of Toronto, Center for Research in Education at the University Health Network, Toronto, Ontario, Canada.
Source
Am J Surg. 2000 Sep;180(3):234-7
Date
Sep-2000
Language
English
Publication Type
Article
Keywords
Benchmarking - standards
Clinical Competence - standards
Feasibility Studies
General Surgery - education
Humans
Internship and Residency - standards
Ontario
Reproducibility of Results
Abstract
This study examined whether an operative product and time to completion could serve as measures of technical skill.
Nine final-year (PGY5) and 11 penultimate-year (PGY4) general surgery residents participated in a 6-station bench model examination. Time to completion was recorded. Twelve faculty surgeons (2 per station) evaluated the quality of the final product using a 5-point scale.
The mean interrater reliability was 0.59 for product quality. Interstation reliability was 0.59 for analysis of the final product and 0.72 for time to completion. Agreement between attendings' ratings and the product quality and time scores was 63% and 78%, respectively. PGY5s' mean product quality score was 4.14 +/- 0.26, compared with 3.82 +/- 0.33 for PGY4s (P
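A minimal sketch of one way a mean interrater reliability such as the 0.59 above could be computed: correlate the two faculty raters' product-quality scores at each station and average the per-station correlations. The coefficient used in the study is not stated, and all ratings below are made up.

    import numpy as np

    # Hypothetical 5-point product-quality ratings from the two faculty surgeons
    # at each of three stations, for the same six residents.
    stations = [
        (np.array([4, 3, 5, 2, 4, 3]), np.array([4, 2, 5, 3, 4, 3])),
        (np.array([3, 4, 4, 2, 5, 3]), np.array([3, 4, 5, 2, 4, 2])),
        (np.array([5, 3, 4, 3, 4, 4]), np.array([4, 3, 4, 2, 5, 4])),
    ]

    per_station = [np.corrcoef(r1, r2)[0, 1] for r1, r2 in stations]
    print("per-station r:", [round(r, 2) for r in per_station])
    print("mean interrater reliability:", round(float(np.mean(per_station)), 2))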
PubMed ID
11084137
