Pipeline (Unix)

In Unix-like operating systems, a pipeline is a mechanism for inter-process communication that chains together a set of processes by their standard streams, so that the output of each process is passed as input to the next. This concept, championed by Douglas McIlroy at Bell Labs during the development of Unix, brings clarity and simplicity to the system by hiding internals. This article focuses on anonymous pipes: unidirectional channels that buffer data written by one process until it is read by the next, and that disappear once the processes complete.
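As a minimal sketch of the idea, the shell creates an anonymous pipe between each pair of adjacent commands, connecting one command's stdout to the next command's stdin. The word list below is illustrative:

```shell
# Find the most frequent word in a stream using a pipeline.
# Each '|' is an anonymous pipe created by the shell.
printf 'apple\nbanana\napple\ncherry\napple\n' |
  sort |       # group identical lines together
  uniq -c |    # prefix each distinct line with its count
  sort -rn |   # order by count, most frequent first
  head -n 1    # keep only the top entry
```

Each stage runs as a separate process; the pipe buffers data between them, so the stages execute concurrently rather than one after another.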

3 courses cover this concept

CS 110: Principles of Computer Systems

Stanford University

Winter 2022

CS 110 delves into advanced computer systems and program construction, focusing on designing large systems, software that spans multiple machines, and parallel computing. This course builds upon CS107 and requires good knowledge of C, C++, Unix, GDB, Valgrind, and Make. It covers Linux filesystems, multiprocessing, threading, networking, and more.


CS 110: Principles of Computer Systems

Stanford University

Summer 2021

Requiring familiarity with C/C++ and Unix/Linux, this course delves into the principles of computer systems. Students work in a blend of C and C++ to interface with system resources and manage complex projects. Topics include filesystems, multiprocessing, synchronization, networking, and MapReduce.


CSCI 0300: Fundamentals of Computer Systems

Brown University

Spring 2023

An introductory course covering computer system fundamentals, including machine organization, systems programming in C/C++, operating systems concepts, isolation, security, virtualization, concurrency, and distributed systems. Projects involve implementing core OS functionality.
