Technical Papers
May 14, 2014

Sustainable Engineering Assessment Using Rubric-Based Analysis of Challenge Question Responses

Publication: Journal of Professional Issues in Engineering Education and Practice
Volume 141, Issue 2

Abstract

A method was developed to evaluate students’ application of sustainability principles in engineering design and problem solving. Students were presented with two challenge questions, each posing a scenario; each student responded to one question, and the answers were scored using a rubric that combined analytic and holistic criteria to account for sustainability principles. The rubric was developed iteratively and met standard measures of reliability and validity. Based on responses from primarily civil and environmental engineering students at three institutions, it was found that male students, seniors, and participants in many diverse learning activities (extracurricular service, internships, undergraduate research, and engineering design courses) achieved higher rubric scores. Scores on the challenge questions did not correlate with students’ interest in sustainable engineering, which was measured using a Likert-based survey. A challenge-question approach to formative assessment may enable instructors to adapt curricula to foster student learning of implicit sustainability issues in engineering. The rubric may be used with a variety of content-focused challenge questions as a tool for assessing students’ holistic, sustainability-focused approaches to engineering.

Acknowledgments

This material is based on work supported by the National Science Foundation Engineering Education Program under Grant Nos. 0935082, 0935209, and 0934567. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

History

Received: Dec 18, 2013
Accepted: Apr 3, 2014
Published online: May 14, 2014
Discussion open until: Oct 14, 2014
Published in print: Apr 1, 2015

Authors

Affiliations

Mary McCormick
Ph.D. Candidate, Science, Technology, Engineering, and Mathematics (STEM) Education, Tufts Univ., Medford, MA 02155.
Kristina Lawyer
Instructor, Automotive Engineering Technology, College of Technology, Indiana State Univ., Terre Haute, IN 47809-9989.
Jonathan Wiggins
Research Assistant, Univ. of Colorado, Boulder, CO 80309.
Christopher W. Swan, A.M.ASCE
Associate Dean for Undergraduate Curriculum Development, School of Engineering, Tufts Univ., 113 Anderson Hall, 200 College Ave., Medford, MA 02155.
Kurtis G. Paterson
P.E.
Dept. Head, Engineering, James Madison Univ., MSC 4116, 701 Carrier Dr., Harrisonburg, VA 22807.
Angela R. Bielefeldt, M.ASCE [email protected]
P.E.
Professor, Univ. of Colorado Boulder, 428 UCB, Boulder, CO 80309 (corresponding author). E-mail: [email protected]
