Cognitive Complexity of Comprehending Computer Programs

School of Science | Doctoral thesis (article-based) | Defence date: 2020-06-29
198 + app. 75
Aalto University publication series DOCTORAL DISSERTATIONS, 100/2020
Instructional designers must consider learners' learning trajectories and design tasks that are neither too hard nor too easy, sequencing tasks from less to more complex. Most assessment efforts in programming have been directed at code writing. However, programming is a multi-faceted skill that includes precursory skills such as program comprehension, which recent studies suggest involves many interacting elements. An essential part of assessment is characterizing what makes a program unique and estimating learners' prior knowledge. When programs are different enough, instructors can intuitively compare the effort demanded of learners. However, when the difference is subtle, instructors struggle to evaluate a program's cognitive demands. Complexity is the metric used to describe the cognitive demands of a task. While research on computational and software-engineering metrics of complexity is well established, little research has been devoted to evaluating the complexity of comprehending programs from the learners' perspective. In general, the taxonomies used in computing education to evaluate complexity do not address the core content of programming tasks. While subjective evaluations of difficulty have proved useful in empirical studies, researchers have advocated a complementary a priori analytical metric of complexity to support instructional design. To address these gaps, we develop and present the toolset needed to define and evaluate the cognitive complexity of comprehending computer programs. We introduce a novel conceptualization, the Rules of Program Behavior, which augments previous research on notional machines, offers guidelines for communicating the teaching of language semantics among practitioners, and sets expectations for learners' possible mental models.
We designed and partially validated a self-evaluation instrument for assessing prior programming knowledge, inspired by frameworks successfully used in linguistics. These tools serve as input to our theoretical framework of cognitive complexity, which is grounded in theories from educational psychology. The framework analyzes the cognitive elements present in a given program and the way these elements are intertwined, extracting measurable aspects of complexity. Finally, we investigated instructors' perspectives on program comprehension, presented activities that foster program comprehension, and showed how such activities can be sequenced into learning trajectories. Our framework and tools lay the foundation for evaluating the cognitive complexity of program comprehension. We expect the results of this thesis to support the design of assessment instruments, curricula, and programming languages. Our work particularly fits frameworks that holistically integrate skills and knowledge through authentic tasks while keeping learners' cognitive load in check. We believe our results can be adapted to other aspects of programming and could help researchers generate and test hypotheses related to program comprehension.
Supervising professor
Malmi, Lauri, Prof., Aalto University, Department of Computer Science, Finland
Thesis advisor
Sorva, Juha, Dr., Aalto University, Department of Computer Science, Finland
Keywords
complexity, plans, self-evaluation, notional machines, program comprehension
Other note
  • [Publication 1]: Duran, Rodrigo. Design of Rules of Program Behavior for Teaching. Submitted for publication in February 2020, 9 pages.
  • [Publication 2]: Duran, Rodrigo; Sorva, Juha; Leite, Sofia. Towards an Analysis of Program Complexity From a Cognitive Perspective. In Proceedings of the 2018 ACM Conference on International Computing Education Research (ICER ’18), Espoo, Finland, pages 21-30, August 2018.
    DOI: 10.1145/3230977.3230986
  • [Publication 3]: Duran, Rodrigo Silva; Rybicki, Jan-Mikael; Hellas, Arto; Suoranta, Sanna. Towards a Common Instrument for Measuring Prior Programming Knowledge. In Proceedings of the 2019 ACM Conference on Innovation and Technology in Computer Science Education, Aberdeen, UK, pages 443-449, 2019.
    DOI: 10.1145/3304221.3319755
  • [Publication 4]: Duran, Rodrigo; Rybicki, Jan-Mikael; Sorva, Juha; Hellas, Arto. Exploring the Value of Student Self-Evaluation in Introductory Programming. In Proceedings of the 2019 ACM Conference on International Computing Education Research (ICER ’19), Toronto, CA, pages 121-130, 2019.
    DOI: 10.1145/3291279.3339407
  • [Publication 5]: Izu, Cruz; Schulte, Carsten; Aggarwal, Ashish; Cutts, Quintin; Duran, Rodrigo; Gutica, Mirela; Heinemann, Birte; Kraemer, Eileen; Lonati, Violetta; Mirolo, Claudio; Weeda, Renske. Fostering Program Comprehension in Novice Programmers - Learning Activities and Learning Trajectories. In Proceedings of the 2019 ITiCSE Conference on Working Group Reports, Aberdeen, UK, pages 1-26, 2019.
    DOI: 10.1145/3344429.3372501