TECHNICAL PAPERS
Oct 1, 2007

Reliability and Validity of FE Exam Scores for Assessment of Individual Competence, Program Accreditation, and College Performance

Publication: Journal of Professional Issues in Engineering Education and Practice
Volume 133, Issue 4

Abstract

This study explores whether Fundamentals of Engineering (FE) exam scores are reliable and valid measures for assessing individual competence, accrediting engineering programs, and evaluating college performance, the three purposes for which FE scores are most commonly used. Findings indicate that reliability and validity erode as assessment moves further from the individual level. That is, FE exam scores are probably reliable and valid indicators of minimal technical competency at the individual level. However, program-level assessment requires a careful, fine-grained comparison of the subject-content statistics reported by the National Council of Examiners for Engineering and Surveying (NCEES) against stated program objectives, and for some engineering programs, FE exam data will not serve as reliable and valid assessment indicators. Further, assessing entire engineering schools on the basis of college-wide FE exam pass rates is inappropriate and cannot be rationally supported.
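
As an illustration of the fine-grained, program-level comparison described above, the sketch below shows one way a program might line up its graduates' FE subject-area performance against national averages, grouped under locally defined program objectives. This is a minimal, hypothetical Python example: the subject names, the score values, and the subject-to-objective mapping are assumptions made for illustration only, not the paper's method or an NCEES reporting format.

    # Hypothetical sketch: compare a program's FE subject-area results with
    # national averages, aggregated under locally defined program objectives.
    # All values and mappings below are illustrative assumptions.
    from collections import defaultdict

    # Assumed format: subject area -> (program avg % correct, national avg % correct)
    fe_subject_results = {
        "Mathematics": (62.0, 58.0),
        "Statics": (48.0, 55.0),
        "Fluid Mechanics": (51.0, 53.0),
        "Ethics and Professional Practice": (70.0, 66.0),
    }

    # Hypothetical mapping of FE subject areas to a program's stated objectives.
    objective_map = {
        "Mathematics": "Objective 1: analytical foundations",
        "Statics": "Objective 2: engineering mechanics",
        "Fluid Mechanics": "Objective 2: engineering mechanics",
        "Ethics and Professional Practice": "Objective 3: professional responsibility",
    }

    # Collect program-minus-national differences under each objective.
    gaps_by_objective = defaultdict(list)
    for subject, (program_avg, national_avg) in fe_subject_results.items():
        gaps_by_objective[objective_map[subject]].append(program_avg - national_avg)

    for objective, gaps in gaps_by_objective.items():
        mean_gap = sum(gaps) / len(gaps)
        print(f"{objective}: mean gap vs. national average = {mean_gap:+.1f} points")

Even in this toy form, such a comparison is meaningful only when subject areas map cleanly onto program objectives and cohorts are large enough to yield stable statistics, which is the caution the paper raises about program-level use of FE data.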

Acknowledgments

The writer thanks Walt LeFevre, Jimmy Smith, and several anonymous peer reviewers for their critical review and constructive comments on earlier drafts of this paper.

References

Accreditation Board for Engineering and Technology (ABET). (1998). “Engineering criteria 2000.” How do you measure success? Designing effective processes for assessing engineering education, American Society for Engineering Education, Washington, D.C., 13–16.
American Bar Association. (2005). ⟨http://www.abanet.org/⟩ (June 14, 2005).
American Medical Association. (2005). ⟨http://www.ama-assn.org/⟩ (June 14, 2005).
ASCE. (2005). “The purpose of the fundamentals of engineering examination.” ASCE Policy Statement 369, ⟨http://www.asce.org/pressroom/news/policy_details.cfm?hdlid=96⟩ (June 14, 2005).
Bakos, J. D., Jr. (1996). “Direct outcome-based assessment measures.” J. Profl. Issues Eng. Educ. Pract., 122(1), 31–34.
Bakos, J. D., Jr. (1999). “Outcomes assessment: Sharing responsibilities.” J. Profl. Issues Eng. Educ. Pract., 125(3), 108–111.
Brigham, S. (1994). “Introduction.” CQI 101: A first reader for higher education, AAHE Continuous Quality Improvement Project, American Association of Higher Education, Washington, D.C.
Broadfoot, P. M. (1996). Education, assessment and society: A sociological analysis, Open University Press, Buckingham, U.K.
Brookhart, S. M. (1999). “The art and science of classroom assessment: The missing part of pedagogy.” ASHE-ERIC Higher Education Rep., Vol. 27, No. 1, The George Washington Univ., Washington, D.C.
Callan, P. M., Doyle, W., and Finney, J. E. (2001). “Evaluating state higher education performance: Measuring up 2000.” Change, 33(2), 10–19.
Curry, L., and Wergin, J. F. (1993). “Professional education.” Handbook of the undergraduate curriculum: A comprehensive guide to purposes, structures, practices, and change, Jerry G. Gaff and James L. Ratcliff, eds., Jossey-Bass Publishers, San Francisco, 1997.
Diamond, R. M. (1998). Designing and assessing courses & curricula, A practical guide, Jossey-Bass Publishers, San Francisco.
Eaton, J. S. (2001). “Regional accreditation reform: Who is served?” Change, 33(2), 38–45.
Education Commission of the States. (1995). “Making quality count in undergraduate education.” Rep. PS-95-1 for the ECS Chairman’s “Quality Counts” Agenda in Higher Education, Denver.
Educational Testing Service. (2005). ⟨http://www.ets.org/⟩ (June 14, 2005).
Ewell, P. T. (1993). “Total quality and academic practice: The idea we’ve been waiting for.” Change, 25(3), 49–55.
Ewell, P. T. (2001). “Statewide testing in higher education.” Change, 33(2), 21–27.
Ferrier, M. B. (1994). “In search of effective quality assessment.” ASEE Prism, 4(1), 23–25.
Ford, G., and Gibbs, N. E. (1996). “A mature profession of software engineering.” Technical Rep. CMU/SEI-96-TR-004, Software Engineering Institute, Carnegie Mellon Univ., ⟨http://www.sei.cmu.edu/pub/documents/96.reports/pdf/tr004.96.pdf⟩ (June 14, 2005).
Gaither, G. H., ed. (1995). “Assessing performance in an age of accountability: Case studies.” New directions for higher education, Vol. 91, Jossey-Bass Publishers, San Francisco.
Hall, B. L. (1994). “State of the art: An examination and classification of existing methods of student outcomes assessment in American higher education.” D.E. dissertation, Texas Tech Univ., Lubbock, Tex.
Halpern, D. F., ed. (1987). “Student outcomes assessment: Introduction and overview.” Student outcomes assessment: What institutions stand to gain, New Directions for Higher Education, No. 59, Vol. XV, No. 3, Jossey-Bass, Inc., San Francisco.
Joint Task Force on Engineering Education Assessment. (1996). “A general assessment framework.” How do you measure success? Designing effective processes for assessing engineering education, American Society for Engineering Education, Washington, D.C., 17–26.
LeFevre, W., Smith, J. W., Steadman, J. W., and White, K. R. (1999). “Using the fundamentals of engineering (FE) examination to assess academic programs.” National Council of Examiners for Engineering and Surveying, Clemson, S.C.
Linn, R. L., and Gronlund, N. E. (1995). Measurement and assessment in teaching, 7th Ed., Pearson Education, Inc., Upper Saddle River, N.J., 48.
Lubinescu, E. S., Ratcliff, J. L., and Gaffney, M. A., eds. (2001). “Two continuums collide: Accreditation and assessment.” How accreditation influences assessment, New Directions for Higher Education, No. 113, Jossey-Bass Publishers, San Francisco.
Mazurek, D. F. (1995). “Consideration of FE exam for program assessment.” J. Profl. Issues Eng. Educ. Pract., 121(4), 247–249.
McMillan, J. H. (2001). Essential assessment concepts for teachers and administrators, Corwin Press, Thousand Oaks, Calif.
National Board of Medical Examiners (NBME). (2005). “Scores and transcripts.” United States Medical Licensing Examination, ⟨http://www.usmle.org/scores/scores.htm⟩ (June 14, 2005).
National Conference of Bar Examiners (NCBE). (2005). “2004 statistics.” Bar admission statistics, ⟨http://www.ncbex.org/pubs/pdf/740205/2004statistics.pdf⟩ (June 14, 2005).
National Council of Examiners for Engineering and Surveying (NCEES). (2003). Rep. of the Engineering Licensure Qualifications Task Force, NCEES, Clemson, S.C., ⟨http://www.ncees.org/introduction/aboutn_cees/2003_elqtf_report.pdf⟩.
National Council of Examiners for Engineering and Surveying (NCEES). (2005). ⟨http://www.ncees.org⟩ (June 14, 2005).
Neal, J. E. (1995). “Overview of policy and practice: Differences and similarities in developing higher education accountability.” Assessing performance in an age of accountability: Case studies, G. H. Gaither, ed., New Directions for Higher Education, No. 91, Jossey-Bass Publishers, San Francisco.
Nirmalakhandan, N., and White, K. (2000). “Course refinement through outcomes assessment: A case study.” J. Profl. Issues Eng. Educ. Pract., 126(1), 27–31.
Oberle, J. (1990). “Quality gurus: The men and their message.” Training Magazine, Lakewood Publications, January, 47–52.
Palomba, C. A., and Banta, T. W. (1999). Assessment essentials: Planning, implementing, and improving assessment in higher education, Jossey-Bass Publishers, San Francisco.
Resnick, D. P. (1987). “Expansion, quality and testing in American education.” Issues in student assessment, D. Bray and M. J. Belcher, eds., New Directions for Community Colleges, No. 59, Vol. XV, No. 3, Jossey-Bass, Inc., San Francisco.
Ruppert, S. S. (1995). “Roots and realities of state-level performance indicator systems.” Assessing performance in an age of accountability: Case studies, G. H. Gaither, ed., New Directions for Higher Education, No. 91, Jossey-Bass Publishers, San Francisco.
Schwartz, A. (2001). “Time to take a fresh look at the engineering profession.” Eng. Times, 23(5), 5.
Senge, P. M. (1990). “The leader’s new work: Building learning organizations.” Sloan Manage. Rev., 32(1), 7–23.
Van Dyke, J., Rudolph, L. B., and Bower, K. A. (1993). “Performance funding.” Making a difference: Outcomes of a decade of assessment in higher education, T. W. Banta, ed., Jossey-Bass, Inc., San Francisco, 283–293.
Wellman, J. V. (2001). “Assessing state accountability systems.” Change, 33(2), 47–52.
Wicker, R. B., Quintana, R., and Tarquin, A. (1999). “Evaluation model using fundamentals of engineering examination.” J. Profl. Issues Eng. Educ. Pract., 125(2), 47–55.
Withington, J. P. (1999). “Short cycle assessment techniques for ABET criteria 2000 compliance.” Proc., ASEE/IEEE Frontiers in Education Conf., San Juan, Puerto Rico.

Information & Authors

Published In

Journal of Professional Issues in Engineering Education and Practice
Volume 133, Issue 4, October 2007
Pages: 320 - 326

History

Received: Aug 18, 2005
Accepted: Nov 1, 2005
Published online: Oct 1, 2007
Published in print: Oct 2007

Authors

Affiliations

William D. Lawson, P.E., M.ASCE
Deputy Director, National Institute for Engineering Ethics, Texas Tech Univ., Box 41023, Lubbock, TX 79409-1023. E-mail: [email protected]
