
Common Task Errors

java.lang.OutOfMemoryError: Java heap space

Reason: The memory allocated to a single task is too low, or the amount of data the task processes is too large, causing the task to run out of memory (OOM).

Solution:

a. Executor-side OOM: when submitting the task, increase the --executor-memory parameter

b. Driver-side OOM: increase the --driver-memory parameter, or reduce the task parallelism by modifying /home/hadoop/spark/conf/spark-defaults.conf and adding spark.default.parallelism 40 (see the example below)
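For reference, a submit command with both memory settings raised might look like the following; the application class, jar name, and memory sizes are placeholders to adapt to your workload:

```
# Raise executor and driver memory at submit time (sizes are examples only)
spark-submit \
  --master yarn \
  --class com.example.MyApp \
  --executor-memory 4g \
  --driver-memory 4g \
  my-app.jar

# To adjust parallelism instead, append this line to
# /home/hadoop/spark/conf/spark-defaults.conf:
#   spark.default.parallelism 40
```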

java.lang.ClassNotFoundException

Reason: A jar required by the task is missing at submit time. The class name printed after java.lang.ClassNotFoundException indicates which package is missing.

Solution:

a. Pass --jars when submitting the task with spark-submit, separating multiple packages with commas

b. When there are many packages, put them in a single directory and set spark.executor.extraClassPath or spark.driver.extraClassPath in /home/hadoop/spark/conf/spark-defaults.conf to point to that directory (see the example below)
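Both approaches are sketched below; the jar names and the /home/hadoop/extlib directory are illustrative placeholders for your own dependencies:

```
# Option a: pass the missing jars at submit time, comma-separated
spark-submit \
  --master yarn \
  --class com.example.MyApp \
  --jars /home/hadoop/libs/mysql-connector-java.jar,/home/hadoop/libs/fastjson.jar \
  my-app.jar

# Option b: with many jars, put them in one directory and add these lines to
# /home/hadoop/spark/conf/spark-defaults.conf (the wildcard assumes standard
# JVM classpath expansion):
#   spark.executor.extraClassPath /home/hadoop/extlib/*
#   spark.driver.extraClassPath   /home/hadoop/extlib/*
```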

User root cannot submit applications to queue root.root

Reason: The submitting user does not have permission to submit to the specified queue or the default queue

Solution: Add --queue when submitting the task to specify a queue the user has permission to use (see the example below)
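For example, assuming the submitting user is authorized for a queue named root.default (the queue name is illustrative):

```
# Submit to a queue the user has permission for (queue name is an example)
spark-submit \
  --master yarn \
  --queue root.default \
  --class com.example.MyApp \
  my-app.jar
```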