Question: How does Spark handle fault tolerance in RDDs?
Answer: Spark achieves fault tolerance in RDDs through lineage: each RDD records the chain of transformations (a DAG) used to derive it from its source data, instead of replicating the data itself. If a partition of an RDD is lost, for example because an executor fails, Spark recomputes just that partition by replaying the lineage from the parent RDDs or the original source.

Example:

// resilientRDD carries lineage back to originalRDD; a lost partition is rebuilt by re-running the filter.
val resilientRDD = originalRDD.filter(x => x > 0)
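
To see the lineage Spark would use for recomputation, you can inspect an RDD's toDebugString. Below is a minimal, self-contained sketch; the app name, value names, and the local[*] master are illustrative assumptions, not part of the original example:

import org.apache.spark.sql.SparkSession

object LineageDemo {
  def main(args: Array[String]): Unit = {
    // Illustrative local setup; in a cluster the master would come from spark-submit.
    val spark = SparkSession.builder()
      .appName("lineage-demo")
      .master("local[*]")
      .getOrCreate()
    val sc = spark.sparkContext

    // Build an RDD through a chain of transformations; each step is recorded in the lineage.
    val originalRDD  = sc.parallelize(1 to 1000, numSlices = 4)
    val resilientRDD = originalRDD.filter(x => x > 0).map(x => x * 2)

    // toDebugString prints the lineage (the DAG of parent RDDs) that Spark would
    // replay to recompute any lost partition of resilientRDD.
    println(resilientRDD.toDebugString)

    // Trigger an action; if a partition were lost mid-job, only that partition
    // would be recomputed from originalRDD via the lineage above.
    println(resilientRDD.count())

    spark.stop()
  }
}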