A Systematic Literature Review of Student Assessment Framework in Software Engineering Courses
Background: Software engineering courses comprise various project types, ranging from simple assignments completed in supervised settings to more complex tasks undertaken independently by students, without constant oversight from a teacher or lab assistant. A comprehensive assessment framework is therefore needed to validate the fulfillment of learning objectives and to measure student outcomes, particularly in computer science and software engineering. This motivates the delineation of an appropriate assessment structure and pattern.
Objective: This study aimed to identify the knowledge required for assessing student performance in computer science and software engineering courses.
Methods: A comprehensive literature review covering 2012 to October 2021 was conducted, identifying 20 papers addressing assessment frameworks in software engineering and computer science courses. Specific inclusion and exclusion criteria were applied in two rounds of assessment to select the studies most pertinent to this investigation.
Results: The results showed multiple methods for assessing software engineering and computer science courses, including the Assessment Matrix, Automatic Assessment, CDIO, Cooperative Thinking, formative and summative assessment, Game, Generative Learning Robot, NIMSAD, SECAT, Self-assessment and Peer-assessment, SonarQube Tools, WRENCH, and SEP-CyLE.
Conclusion: Assessment frameworks for software engineering and computer science courses require further refinement; among the approaches reviewed, the most suitable technique identified was the learning framework.
Keywords: Computer science course, Software engineering course, Student assessment, Systematic literature review
Copyright (c) 2023 The Authors. Published by Universitas Airlangga.