Data-Intensive Computing


Data-intensive computing is a type of parallel computing that processes large amounts of data, usually terabytes or petabytes in size. It focuses on I/O and manipulation of data rather than computational requirements. This type of computing is used to process "big data".

1 course covers this concept

15-440 Distributed Systems

Carnegie Mellon University

Fall 2020

A course offering both theoretical understanding and practical experience in distributed systems. Key themes include concurrency, scheduling, network communication, and security. Real-world protocols and paradigms such as distributed filesystems, RPC, and MapReduce are studied. The course uses the C and Go programming languages.

