Task parallelism is a form of parallelization of computer code across multiple processors. It involves running many different tasks at the same time on the same data; pipelining, for example, moves a single set of data through a series of separate tasks. It is distinct from data parallelism, which involves running the same task on different components of the data.
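The distinction can be sketched in a few lines of Python: two different tasks run concurrently on the same data (task parallelism), versus the same task run on different chunks of the data (data parallelism). The function names and the sample data are illustrative, not from the original text.

```python
from concurrent.futures import ThreadPoolExecutor

data = [3, 1, 4, 1, 5, 9, 2, 6]

def compute_sum(xs):
    return sum(xs)

def compute_max(xs):
    return max(xs)

# Task parallelism: two *different* tasks run at the same time on the *same* data.
with ThreadPoolExecutor(max_workers=2) as pool:
    total_future = pool.submit(compute_sum, data)
    peak_future = pool.submit(compute_max, data)
    total, peak = total_future.result(), peak_future.result()

# Data parallelism, for contrast: the *same* task runs on different chunks of the data.
with ThreadPoolExecutor(max_workers=2) as pool:
    partial_sums = list(pool.map(compute_sum, [data[:4], data[4:]]))

print(total, peak, sum(partial_sums))  # both approaches agree on the sum
```

A pipeline, mentioned above, is a special case of task parallelism in which the distinct tasks are chained stages, each stage handing its output to the next while working on the following item.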
Carnegie Mellon University
Fall 2019
This course provides a deep dive into the inner workings of computer systems, enhancing students' effectiveness as programmers. Topics span machine-level code, performance evaluation, computer arithmetic, memory management, and networking protocols. It serves as a foundation for advanced courses like compilers and operating systems.