Adds a listener in the form of a Scala closure to be executed on task completion.
Adds a listener in the form of a Scala closure to be executed on task completion. This will be called in all situations - success, failure, or cancellation. An example use is for HadoopRDD to register a callback to close the input stream.
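A minimal sketch of the closure form, assuming Spark 2.4+ (where the method takes a type parameter to disambiguate the overloads); the local master, app name, and partition count are arbitrary choices for the demo:

```scala
import org.apache.spark.{SparkConf, SparkContext, TaskContext}

val sc = new SparkContext(
  new SparkConf().setMaster("local[2]").setAppName("listener-demo"))
val result = sc.parallelize(1 to 10, 2).mapPartitions { iter =>
  val ctx = TaskContext.get()
  // The closure fires on success, failure, or cancellation of this task,
  // a natural place to release per-task resources such as input streams.
  ctx.addTaskCompletionListener[Unit] { _ =>
    println(s"partition ${ctx.partitionId()} done")
  }
  iter
}.collect()
sc.stop()
```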
Adds a (Java-friendly) listener to be executed on task completion.
Adds a (Java-friendly) listener to be executed on task completion. This will be called in all situations - success, failure, or cancellation. An example use is for HadoopRDD to register a callback to close the input stream.
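The interface form can be sketched in Scala as well; `TaskCompletionListener` lives in `org.apache.spark.util`, and the `CloseOnCompletion` wrapper class here is a made-up name for illustration:

```scala
import org.apache.spark.TaskContext
import org.apache.spark.util.TaskCompletionListener

// Hypothetical listener that closes a stream when the task ends,
// regardless of whether it succeeded, failed, or was cancelled.
class CloseOnCompletion(in: java.io.InputStream) extends TaskCompletionListener {
  override def onTaskCompletion(context: TaskContext): Unit = in.close()
}
```

Inside a task one would then call `TaskContext.get().addTaskCompletionListener(new CloseOnCompletion(stream))`.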
How many times this task has been attempted.
How many times this task has been attempted. The first task attempt will be assigned attemptNumber = 0, and subsequent attempts will have increasing attempt numbers.
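Because the number starts at 0 and grows with each retry, retry-sensitive side effects can gate on it. `isFirstAttempt` below is a hypothetical helper, not part of the Spark API:

```scala
// Hypothetical helper: given the value of TaskContext.attemptNumber(),
// report whether this is the task's first execution (no prior failures).
def isFirstAttempt(attemptNumber: Int): Boolean = attemptNumber == 0
```

For example, a task might skip re-sending a non-idempotent notification when `!isFirstAttempt(ctx.attemptNumber())`.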
Returns true if the task has completed.
Returns true if the task has been killed.
Returns true if the task is running locally in the driver program.
Returns true if the task is running locally in the driver program.
The ID of the RDD partition that is computed by this task.
The ID of the stage that this task belongs to.
An ID that is unique to this task attempt (within the same SparkContext, no two task attempts will share the same attempt ID).
An ID that is unique to this task attempt (within the same SparkContext, no two task attempts will share the same attempt ID). This is roughly equivalent to Hadoop's TaskAttemptID.
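The two attempt-related values are easy to confuse; the hypothetical log formatter below contrasts them under the semantics stated above:

```scala
// Hypothetical log formatter: taskAttemptId is unique across all task
// attempts in a SparkContext, while attemptNumber restarts at 0 per task.
def describeAttempt(taskAttemptId: Long, attemptNumber: Int): String =
  s"TID $taskAttemptId (attempt #$attemptNumber)"
```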
::DeveloperApi::
::DeveloperApi::
Adds a callback function to be executed on task completion.
Adds a callback function to be executed on task completion. An example use is for HadoopRDD to register a callback to close the input stream. This will be called in any situation - success, failure, or cancellation.
Callback function.
(Since version 1.2.0) use addTaskCompletionListener
(Since version 1.3.0) use attemptNumber
(Since version 1.2.0) use isRunningLocally
Contextual information about a task which can be read or mutated during execution. To access the TaskContext for a running task, use:
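The accessor referred to here is `TaskContext.get()`, which returns the context of the currently running task (and null when called outside a task, e.g. on the driver). A minimal sketch, assuming a local master:

```scala
import org.apache.spark.{SparkConf, SparkContext, TaskContext}

val sc = new SparkContext(
  new SparkConf().setMaster("local[2]").setAppName("ctx-demo"))
// Read the running task's context from inside a task.
val partitionIds = sc.parallelize(1 to 4, 2).map { _ =>
  TaskContext.get().partitionId()
}.collect()
sc.stop()
```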