Apache Spark Interview Questions and Answers
Question: How does Spark handle data locality optimization?

Answer: Spark tries to schedule each task on a node that already holds a copy of that task's input partition, minimizing data transfer over the network. The scheduler works through locality levels in order of preference (PROCESS_LOCAL, NODE_LOCAL, RACK_LOCAL, ANY) and waits up to `spark.locality.wait` for a slot at a better level before falling back to a less local one.
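The original code example did not survive extraction. As an illustration only, here is a minimal pure-Python sketch of locality-aware assignment in the spirit of Spark's task scheduling; the `Task`, `Executor`, and `schedule` names are hypothetical and are not Spark APIs.

```python
# Illustrative sketch (NOT Spark source code): assign tasks to executors,
# preferring an executor on a host that stores the task's data (NODE_LOCAL)
# and falling back to any executor (ANY) otherwise.
from dataclasses import dataclass


@dataclass
class Task:
    task_id: int
    preferred_hosts: list  # hosts holding a replica of this task's partition


@dataclass
class Executor:
    executor_id: str
    host: str


def schedule(tasks, executors):
    """Map each task id to (executor_id, locality_level)."""
    assignments = {}
    for task in tasks:
        # First choice: an executor co-located with the data.
        local = next(
            (e for e in executors if e.host in task.preferred_hosts), None
        )
        chosen = local or executors[task.task_id % len(executors)]
        level = "NODE_LOCAL" if local else "ANY"
        assignments[task.task_id] = (chosen.executor_id, level)
    return assignments


tasks = [Task(0, ["host-a"]), Task(1, ["host-c"])]
executors = [Executor("exec-1", "host-a"), Executor("exec-2", "host-b")]
print(schedule(tasks, executors))
# Task 0's data lives on host-a, so it runs locally on exec-1;
# task 1 has no local executor and falls back to ANY placement.
```

Real Spark gets the preferred hosts from the storage layer (e.g. HDFS block locations or cached RDD partitions) and, unlike this sketch, will briefly delay a task rather than immediately accept a worse locality level.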
Related questions:
- What is the purpose of the Spark SQL module?
- Explain the difference between narrow and wide transformations in Spark.