cassiez

The fact that achieving large amounts of parallelism requires utilizing a large number of cores reveals one big advantage of data-parallel thinking and of computation frameworks such as Spark and MapReduce: programmers don't need to worry much about managing the underlying cluster (for example, how the machines communicate to pass data around, or how to handle the failure of a single node). The technical details of managing large numbers of machines are abstracted away from the programmer.
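To make that concrete, here is a minimal word-count sketch in PySpark (the input file name and the local master setting are placeholder assumptions, not anything from the lecture). The programmer only expresses the data-parallel transformations; partitioning the data across workers, moving it between machines, and re-running tasks when a node fails are all handled by the runtime:

```python
# Minimal PySpark word count. Note that nothing here mentions machines,
# network communication, or failure handling -- Spark manages all of that.
from pyspark import SparkContext

sc = SparkContext("local[*]", "WordCount")  # "local[*]": assumed local run

counts = (
    sc.textFile("input.txt")                # hypothetical input file
      .flatMap(lambda line: line.split())   # split each line into words
      .map(lambda word: (word, 1))          # emit a (word, 1) pair per word
      .reduceByKey(lambda a, b: a + b)      # sum the counts for each word
)

print(counts.take(10))
sc.stop()
```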

ckk

I think the first point repeats a recurring theme of the course: the programmer must explicitly tell the compiler/processor which parts of their code can be parallelized.
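As one small illustration of this (using Python's standard multiprocessing module, not any framework from the lecture): a plain serial loop gives the runtime no information about parallelism, while rewriting the same work as a `Pool.map` is the programmer's explicit declaration that the iterations are independent and may run concurrently:

```python
# Sketch: the serial loop and the parallel map compute the same result,
# but only the latter tells the runtime the iterations are independent.
from multiprocessing import Pool

def square(x):
    return x * x

if __name__ == "__main__":
    data = list(range(10))

    # Serial version: nothing marks this loop as parallelizable.
    serial = [square(x) for x in data]

    # Parallel version: Pool.map explicitly declares that each call to
    # square() is independent, so the pool can spread them across cores.
    with Pool() as pool:
        parallel = pool.map(square, data)

    assert serial == parallel
```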
