A simple implementation of the MapReduce distributed system and programming model.
MapReduce is a programming model and distributed system devised by Google in the early 2000s for parallel, large-scale computation over large data sets.
A large input is partitioned into multiple input splits that are distributed to worker machines running the map function. The map function reads its split and emits an intermediate (key, value) pair for each input entry.
The intermediate (key, value) pairs are saved locally and then sent over the network to worker machines running the reduce function, which typically combines all values that share the same key.
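As a rough sketch of what the two user-supplied functions look like, the type definitions below assume a simple string-based framework; the names `KeyValue`, `MapFunc`, and `ReduceFunc` are illustrative and not taken from this repository's actual API.

```go
// Illustrative sketch only; names are assumptions, not this repo's real API.
package mapreduce

// KeyValue is one intermediate (key, value) pair emitted by the map function.
type KeyValue struct {
	Key   string
	Value string
}

// MapFunc takes one input split (here, its contents as a string) and
// emits intermediate key/value pairs.
type MapFunc func(split string) []KeyValue

// ReduceFunc receives every intermediate value that shares one key and
// combines them into a single output value.
type ReduceFunc func(key string, values []string) string
```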
An example, counting URL access frequency, from the MapReduce paper:
The map function processes logs of web page requests and outputs ⟨URL, 1⟩. The reduce function adds together all values for the same URL and emits a ⟨URL, total count⟩ pair.
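A minimal sketch of that example in Go, under the assumptions above (the grouping step in `main` stands in for the shuffle the framework would perform between the map and reduce phases; all names here are hypothetical, not this repository's API):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// KeyValue is one intermediate (URL, count) pair.
type KeyValue struct {
	Key   string
	Value string
}

// mapURL emits ("URL", "1") for every request in a whitespace-separated log split.
func mapURL(split string) []KeyValue {
	var out []KeyValue
	for _, url := range strings.Fields(split) {
		out = append(out, KeyValue{Key: url, Value: "1"})
	}
	return out
}

// reduceURL adds together all values emitted for a single URL.
func reduceURL(url string, values []string) string {
	total := 0
	for _, v := range values {
		n, _ := strconv.Atoi(v)
		total += n
	}
	return strconv.Itoa(total)
}

func main() {
	// Group intermediate pairs by key, as the framework would after the map phase.
	logs := "/index /about /index /index /about"
	grouped := map[string][]string{}
	for _, kv := range mapURL(logs) {
		grouped[kv.Key] = append(grouped[kv.Key], kv.Value)
	}
	// Emit a (URL, total count) pair for each distinct URL.
	for url, values := range grouped {
		fmt.Printf("%s\t%s\n", url, reduceURL(url, values))
	}
}
```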