Most Typical Spark Interview Questions



Apache Spark Interview Questions List

    1. Why is an RDD called resilient?
    2. What is the difference between persist() and cache()?
    3. What is the difference between lineage and the DAG?
    4. What are narrow and wide transformations?
    5. What are shared variables and what are they used for?
    6. How do you define a custom accumulator? (A sketch follows this list.)
    7. If we have 50 GB of memory and 100 GB of data, how will Spark process it?
    8. How do you create UDFs in Spark? (A sketch follows this list.)
    9. How do you use Hive UDFs in Spark?
    10. What are accumulators and broadcast variables?
    11. How do you decide the various parameter values for spark-submit?
    12. What is the difference between coalesce() and repartition()?
    13. What is the difference between RDD, DataFrame and Dataset? When should you use each?
    14. What is data skew and how do you fix it?
    15. Why shouldn't we use the groupBy transformation in Spark?
    16. How do you do a map-side join in Spark? (A broadcast-join sketch follows this list.)
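For the custom accumulator question, here is a minimal sketch using the AccumulatorV2 API from Spark 2.x onwards. The ErrorCodeAccumulator class, its name, and the sample error codes are illustrative assumptions, not from any particular project.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.util.AccumulatorV2

// A custom accumulator that collects the distinct error codes seen across tasks.
class ErrorCodeAccumulator extends AccumulatorV2[String, Set[String]] {
  private var codes: Set[String] = Set.empty

  override def isZero: Boolean = codes.isEmpty
  override def copy(): ErrorCodeAccumulator = {
    val acc = new ErrorCodeAccumulator
    acc.codes = this.codes
    acc
  }
  override def reset(): Unit = { codes = Set.empty }
  override def add(v: String): Unit = { codes += v }
  override def merge(other: AccumulatorV2[String, Set[String]]): Unit = {
    codes ++= other.value
  }
  override def value: Set[String] = codes
}

object CustomAccumulatorDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("custom-accumulator").getOrCreate()
    val sc = spark.sparkContext

    val errorCodes = new ErrorCodeAccumulator
    sc.register(errorCodes, "errorCodes")   // must be registered before use

    sc.parallelize(Seq("200", "404", "500", "404"))
      .foreach(code => if (code != "200") errorCodes.add(code)) // updated on executors

    println(errorCodes.value)               // read only on the driver: Set(404, 500)
    spark.stop()
  }
}
```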
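For the UDF question, a minimal sketch assuming Spark 2.x or later; the initial_cap function name and the sample data are made up for illustration. It shows both ways of using a UDF: wrapping a Scala function for the DataFrame API and registering it for SQL text.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, udf}

object UdfDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("udf-demo").getOrCreate()
    import spark.implicits._

    val df = Seq(("alice", 42), ("bob", 7)).toDF("name", "score")

    // 1) DataFrame API: wrap a plain Scala function with udf(...)
    val initialCap = udf((s: String) => s.capitalize)
    df.select(initialCap(col("name")).as("name"), col("score")).show()

    // 2) SQL API: register the function so it can be called from SQL text
    spark.udf.register("initial_cap", (s: String) => s.capitalize)
    df.createOrReplaceTempView("people")
    spark.sql("SELECT initial_cap(name) AS name, score FROM people").show()

    spark.stop()
  }
}
```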
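For the map-side join question, the usual approach is to broadcast the small side so the join happens without shuffling the large table. The sketch below, with made-up orders/countries data, shows both the DataFrame broadcast hint and a broadcast variable with plain RDDs.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.broadcast

object MapSideJoinDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("map-side-join").getOrCreate()
    import spark.implicits._

    val orders    = Seq((1, "US", 100.0), (2, "IN", 50.0)).toDF("order_id", "country_code", "amount")
    val countries = Seq(("US", "United States"), ("IN", "India")).toDF("country_code", "country_name")

    // DataFrame API: the broadcast hint ships the small table to every executor,
    // so the join is performed map-side with no shuffle of the large table.
    orders.join(broadcast(countries), "country_code").show()

    // RDD API: broadcast a small lookup map and join inside map / mapPartitions.
    val lookup = spark.sparkContext.broadcast(Map("US" -> "United States", "IN" -> "India"))
    val joined = orders.rdd.map { row =>
      val code = row.getAs[String]("country_code")
      (row.getAs[Int]("order_id"), lookup.value.getOrElse(code, "unknown"))
    }
    joined.collect().foreach(println)

    spark.stop()
  }
}
```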
    1. What challenges have you faced in your Spark project?
    2. What are the uses of map, flatMap, mapPartitions, foreach and foreachPartition? (A sketch follows this list.)
    3. What is a pair RDD? When should you use one?
    4. What are the performance optimization techniques in Spark?
    5. What is the difference between cluster mode and client mode?
    6. How do you capture logs in client mode and in cluster mode?
    7. What happens if a worker node dies?
    8. What file formats does Spark support? Which of them are most suitable for our organization's needs?
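For the map / flatMap / mapPartitions / foreach / foreachPartition question, a small sketch on a toy RDD; the data and the per-partition counts are illustrative.

```scala
import org.apache.spark.sql.SparkSession

object TransformationDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("transformations").getOrCreate()
    val sc = spark.sparkContext

    val lines = sc.parallelize(Seq("a b", "c d e"), numSlices = 2)

    // map: exactly one output element per input element
    val lengths = lines.map(_.length)                        // [3, 5]

    // flatMap: zero or more output elements per input element
    val words = lines.flatMap(_.split(" "))                  // [a, b, c, d, e]

    // mapPartitions: called once per partition, useful for per-partition setup
    // (e.g. opening a connection once instead of once per record)
    val counted = words.mapPartitions { iter =>
      val batch = iter.toList
      Iterator(batch.size)                                   // one count per partition
    }

    // foreach: an action, runs the function on every element on the executors
    words.foreach(w => println(s"word: $w"))

    // foreachPartition: an action, one call per partition (e.g. batched writes to a DB)
    counted.foreachPartition(iter => iter.foreach(n => println(s"partition count: $n")))

    spark.stop()
  }
}
```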



  1. What is the difference between reduceByKey() and groupByKey()? (A sketch follows this list.)
  2. What is the difference between Spark 1 and Spark 2?
  3. How do you debug Spark jobs?
  4. What is the difference between var and val?
  5. What size of file do you use for development?
  6. How long does your script take to run in production?
  7. How do you perform joins using RDDs? (A sketch follows this list.)
  8. How do you run your job in Spark?
  9. What is the difference between a Spark DataFrame and a Dataset?
  10. How are Datasets type-safe? (A sketch follows this list.)
  11. What are sink processors?
  12. What is lazy evaluation in Spark and what are its benefits?
  13. After spark-submit, what processes run behind the application?
  14. How do you decide the number of stages in a Spark job?
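For reduceByKey() versus groupByKey(), the sketch below computes the same per-key sums both ways on made-up data; the practical difference is that reduceByKey combines values map-side before the shuffle, so much less data crosses the network.

```scala
import org.apache.spark.sql.SparkSession

object ReduceVsGroupByKey {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("reduce-vs-group").getOrCreate()
    val sc = spark.sparkContext

    val pairs = sc.parallelize(Seq(("a", 1), ("b", 1), ("a", 1), ("a", 1)))

    // reduceByKey combines values locally on each partition (map-side combine)
    // before shuffling, so only partial sums are sent over the network.
    val sumsReduce = pairs.reduceByKey(_ + _)                   // (a,3), (b,1)

    // groupByKey shuffles every (key, value) pair and only then aggregates,
    // which is slower and can exhaust memory for heavily skewed keys.
    val sumsGroup = pairs.groupByKey().mapValues(_.sum)         // same result

    sumsReduce.collect().foreach(println)
    sumsGroup.collect().foreach(println)
    spark.stop()
  }
}
```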
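For joins using RDDs, a minimal sketch with made-up employee/salary pair RDDs; RDD joins only work on (key, value) pairs.

```scala
import org.apache.spark.sql.SparkSession

object RddJoinDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("rdd-joins").getOrCreate()
    val sc = spark.sparkContext

    val employees = sc.parallelize(Seq((1, "alice"), (2, "bob"), (3, "carol")))
    val salaries  = sc.parallelize(Seq((1, 90000), (2, 75000), (4, 60000)))

    // inner join: keeps only keys present on both sides
    employees.join(salaries).collect().foreach(println)          // (1,(alice,90000)), (2,(bob,75000))

    // left outer join: keeps carol with None for the missing salary
    employees.leftOuterJoin(salaries).collect().foreach(println)

    // full outer join: keeps unmatched keys from both sides
    employees.fullOuterJoin(salaries).collect().foreach(println)

    spark.stop()
  }
}
```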
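For the Dataset type-safety question, a minimal sketch assuming Spark 2.x or later; the Employee case class and its fields are made up. The typed API is checked by the Scala compiler, whereas the equivalent DataFrame code with a wrong column name only fails at runtime.

```scala
import org.apache.spark.sql.SparkSession

// The case class gives the Dataset its compile-time schema.
case class Employee(name: String, salary: Double)

object DatasetTypeSafetyDemo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("dataset-type-safety").getOrCreate()
    import spark.implicits._

    val ds = Seq(Employee("alice", 90000), Employee("bob", 75000)).toDS()

    // Typed API: field names and types are verified at compile time.
    val highEarners = ds.filter(e => e.salary > 80000).map(_.name)

    // The untyped DataFrame equivalent compiles even with a misspelled column
    // and only fails when the job runs:
    // ds.toDF().filter($"salry" > 80000)   // AnalysisException at runtime

    highEarners.show()
    spark.stop()
  }
}
```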

The questions above are relevant to both beginner and experienced Spark developers.

