Complexity Measures for Map-Reduce, and Comparison to Parallel Computing
Publication Type
Report
Goel, A; Munagala, K
November 28, 2012
The programming paradigm Map-Reduce and its main open-source implementation, Hadoop, have had an enormous impact on large-scale data processing. Our goal in this expository write-up is two-fold: first, we want to present some complexity measures that allow us to talk about Map-Reduce algorithms formally, and second, we want to point out why this model is actually different from other models of parallel programming, most notably the PRAM (Parallel Random Access Machine) model. We are looking for complexity measures that are detailed enough to make fine-grained distinctions between different algorithms, but which also abstract away many of the implementation details.
Citation
APA: Goel, A., & Munagala, K. (2012). Complexity Measures for Map-Reduce, and Comparison to Parallel Computing.
Chicago: Goel, Ashish, and Kamesh Munagala. “Complexity Measures for Map-Reduce, and Comparison to Parallel Computing,” November 28, 2012.
ICMJE: Goel A, Munagala K. Complexity Measures for Map-Reduce, and Comparison to Parallel Computing. 2012 Nov.
MLA: Goel, Ashish, and Kamesh Munagala. Complexity Measures for Map-Reduce, and Comparison to Parallel Computing. 28 Nov. 2012.
NLM: Goel A, Munagala K. Complexity Measures for Map-Reduce, and Comparison to Parallel Computing. 2012 Nov.