15-213/18-213/14-513/15-513/18-613 Introduction to Computer Systems

Fall 2019

Carnegie Mellon University

This course provides a deep dive into the inner workings of computer systems, enhancing students' effectiveness as programmers. Topics span machine-level code, performance evaluation, computer arithmetic, memory management, and networking protocols. It serves as a foundation for advanced courses like compilers and operating systems.

Course Page

Overview

The ICS course provides a programmer's view of how computer systems execute programs, store information, and communicate. It enables students to become more effective programmers, especially in dealing with issues of performance, portability and robustness. It also serves as a foundation for courses on compilers, networks, operating systems, and computer architecture, where a deeper understanding of systems-level issues is required. Topics covered include: machine-level code and its generation by optimizing compilers, performance evaluation and optimization, computer arithmetic, memory organization and management, networking technology and protocols, and supporting concurrent computation.

Course Syllabus

Prerequisites

No data.

Learning objectives

Our aim in the course is to help you become a better programmer by teaching you the basic concepts underlying all computer systems. We want you to learn what really happens when your programs run, so that when things go wrong (as they always do) you will have the intellectual tools to solve the problem.

Why do you need to understand computer systems if you do all of your programming in high-level languages? In most of computer science, we’re pushed to make abstractions and stay within their frameworks. But any abstraction ignores effects that can become critical. As an analogy, Newtonian mechanics ignores relativistic effects. The Newtonian abstraction is completely appropriate for bodies moving at less than 0.1c, but higher speeds require working at a greater level of detail.

The following “realities” are some of the major areas where the abstractions you’ve learned in previous classes break down:

  1. Ints are not integers, floats are not reals. Our finite representations of numbers have significant limitations, and because of these limitations we sometimes have to think in terms of bit-level representations (see the sketch after this list).
  2. You’ve got to know assembly language. Even if you never write programs in assembly, the behavior of a program sometimes cannot be understood purely in terms of the abstractions of a high-level language. Further, understanding the effects of bugs requires familiarity with the machine-level model.
  3. Memory matters. Computer memory is not unbounded. It must be allocated and managed. Memory referencing errors are especially pernicious. An erroneous updating of one object can cause a change in some logically unrelated object. Also, the combination of caching and virtual memory provides the functionality of a uniform unbounded address space, but not the performance.
  4. There is more to performance than asymptotic complexity. Constant factors also matter. There are systematic ways to evaluate and improve program performance.
  5. Computers do more than execute instructions. They also need to get data in and out and they interact with other systems over networks.
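
To make the first reality concrete, here is a minimal C sketch (an illustration written for this summary, not part of the course materials) showing that machine arithmetic is not the mathematics of integers and reals: unsigned values wrap around, signed overflow is undefined behavior in ISO C (though typical two's-complement machines wrap), and floating-point addition is not associative.

    #include <stdio.h>
    #include <limits.h>

    int main(void) {
        /* Unsigned arithmetic is modular: subtracting 1 from 0 wraps to UINT_MAX. */
        unsigned int u = 0u;
        printf("0u - 1        = %u\n", u - 1);          /* 4294967295 with 32-bit int */

        /* Signed overflow is undefined behavior in ISO C; on typical
         * two's-complement machines INT_MAX + 1 wraps to INT_MIN. */
        int big = INT_MAX;
        printf("INT_MAX + 1   = %d\n", big + 1);

        /* Floating-point addition is not associative: 3.14 is absorbed
         * when it is added to the huge value first. */
        float a = 1e20f, b = -1e20f, c = 3.14f;
        printf("(a + b) + c   = %f\n", (a + b) + c);    /* 3.140000 */
        printf("a + (b + c)   = %f\n", a + (b + c));    /* 0.000000 */
        return 0;
    }

Compile it with any C compiler and run it; the signed-overflow line in particular may vary with the compiler and optimization flags, which is exactly why bit-level representations matter.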

By the end of the course, you will understand these “realities” in some detail. As a result, you will be prepared to take any of the upper-level systems classes at Carnegie Mellon (both CS and ECE). Even more important, you will have learned skills and knowledge that will help you throughout your career.

In detail, we set forth the following learning objectives, as activities you should be able to do after completing the course:

  1. Explain common bit-level representations of numeric values (unsigned, two’s complement, floating point) and the consequent mathematical properties of arithmetic and bit-level operations on them.
  2. Recognize the relation between programs expressed in C and in assembly code, including the implementation of expressions, control, procedures, and data structures.
  3. Demonstrate the ability to infer the basic intent of a program from its binary representation and apply these skills to debugging programs.
  4. Investigate the programmer’s interaction with the underlying system through the different APIs and abstractions, including system support for process and thread control, virtual memory, and networking.
  5. Analyze the consequences of imperfect system usage, such as poor memory and CPU performance, crashes, and security vulnerabilities.
  6. Apply tools, both standard and self-developed, that will aid program development, including compilers, code analyzers, debuggers, consistency checkers, and profilers.
  7. Apply these analytic and tool-use abilities to create reliable and efficient programs exercising the different components of a modern computing system.
  8. Understand the sources of conflict that can arise when multiple threads of execution share resources, and demonstrate the ability to use synchronization constructs to mediate those conflicts (a minimal example follows this list).
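
As a concrete taste of objective 8, the following minimal sketch (an illustration, not course material; it assumes POSIX threads) has several threads increment a shared counter, using a pthread mutex to serialize the read-modify-write so that no updates are lost.

    #include <pthread.h>
    #include <stdio.h>

    #define NTHREADS 4
    #define NITERS   1000000

    static long counter = 0;
    static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

    /* Each thread increments the shared counter NITERS times. */
    static void *worker(void *arg) {
        (void)arg;
        for (int i = 0; i < NITERS; i++) {
            pthread_mutex_lock(&lock);   /* protect the read-modify-write */
            counter++;
            pthread_mutex_unlock(&lock);
        }
        return NULL;
    }

    int main(void) {
        pthread_t tid[NTHREADS];
        for (int i = 0; i < NTHREADS; i++)
            pthread_create(&tid[i], NULL, worker, NULL);
        for (int i = 0; i < NTHREADS; i++)
            pthread_join(tid[i], NULL);
        /* With the mutex the total is always NTHREADS * NITERS. */
        printf("counter = %ld\n", counter);
        return 0;
    }

Compile with the -pthread flag (for example, gcc -pthread count.c). Removing the lock and unlock calls typically yields a smaller, nondeterministic total, which is the kind of data race the synchronization constructs above are meant to prevent.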

Textbooks and other notes

Other courses in Computer Systems

CS 110: Principles of Computer Systems

Winter 2022

Stanford University

CSE 351 The HW/SW Interface

Autumn 2022

University of Washington

CS 107e Computer Systems from the Ground Up

Winter 2023

Stanford University

CS 107A: Problem-solving Lab for CS 107

Autumn 2022

Stanford University

ELE/COS 475 Computer Architecture

Fall 2019

Princeton University

Courseware availability

Lecture slides and handouts available on the Schedule page

No videos available

No assignments available

Lecture code available on the Schedule page

Additional resources available on the Resources page

Covered concepts