[SPARK-5074] [CORE] [TESTS] Fix the flakey test 'run shuffle with map stage failure' in DAGSchedulerSuite

Test failure: https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-SBT/AMPLAB_JENKINS_BUILD_PROFILE=hadoop2.2,label=centos/2240/testReport/junit/org.apache.spark.scheduler/DAGSchedulerSuite/run_shuffle_with_map_stage_failure/

This is because many tests share the same `JobListener`, and `scheduler` isn't stopped after each test, so it is actually still running. When the test `run shuffle with map stage failure` runs, a previous test may trigger the [ResubmitFailedStages](https://github.com/apache/spark/blob/ebc25a4ddfe07a67668217cec59893bc3b8cf730/core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala#L1120) logic, report `jobFailed`, and overwrite the global `failure` variable.
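For illustration only, here is a minimal sketch (hypothetical names, simplified from the real suite, with a stand-in for Spark's private `JobListener` trait) of how a shared listener plus a never-stopped scheduler can corrupt test state:

```scala
// Hypothetical sketch, not the actual DAGSchedulerSuite code.
object SharedListenerSketch {
  // Stand-in for Spark's private JobListener trait.
  trait JobListener {
    def taskSucceeded(index: Int, result: Any): Unit
    def jobFailed(exception: Exception): Unit
  }

  // Suite-level state that tests read after submitting a job.
  var failure: Exception = null

  val sharedListener: JobListener = new JobListener {
    override def taskSucceeded(index: Int, result: Any): Unit = ()
    // If a scheduler from an earlier test is still running, a late
    // ResubmitFailedStages event can end up calling jobFailed here and
    // overwrite `failure` while an unrelated test is asserting on it.
    override def jobFailed(exception: Exception): Unit = { failure = exception }
  }
}
```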

This PR adds an `after` block that calls `scheduler.stop()` after each test.
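For reference, the `after` hook comes from ScalaTest's `BeforeAndAfter` trait; below is a minimal self-contained sketch of the pattern the diff applies (the suite name, `DummyScheduler`, and the test body are hypothetical stand-ins, not the real `DAGScheduler` or `DAGSchedulerSuite`):

```scala
import org.scalatest.{BeforeAndAfter, FunSuite}

class SchedulerCleanupSketch extends FunSuite with BeforeAndAfter {
  // Stand-in for the real DAGScheduler; only the stop() call matters here.
  private class DummyScheduler {
    def stop(): Unit = ()
  }

  private var scheduler: DummyScheduler = _

  before {
    scheduler = new DummyScheduler
  }

  after {
    // Stop the scheduler so its event loop cannot deliver late events
    // (such as ResubmitFailedStages) to a shared listener while a later
    // test is running.
    scheduler.stop()
  }

  test("scheduler is set up and torn down around each test") {
    assert(scheduler != null)
  }
}
```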

Author: zsxwing <[email protected]>

Closes apache#5903 from zsxwing/SPARK-5074 and squashes the following commits:

1e6f13e [zsxwing] Fix the flakey test 'run shuffle with map stage failure' in DAGSchedulerSuite
zsxwing authored and jeanlyn committed May 28, 2015
1 parent 1da5f7a commit f5e304c
Showing 1 changed file with 6 additions and 1 deletion.
@@ -174,6 +174,10 @@ class DAGSchedulerSuite
     dagEventProcessLoopTester = new DAGSchedulerEventProcessLoopTester(scheduler)
   }
 
+  after {
+    scheduler.stop()
+  }
+
   override def afterAll() {
     super.afterAll()
   }
@@ -261,8 +265,9 @@ class DAGSchedulerSuite
       override def taskSucceeded(partition: Int, value: Any) = numResults += 1
       override def jobFailed(exception: Exception) = throw exception
     }
-    submit(new MyRDD(sc, 0, Nil), Array(), listener = fakeListener)
+    val jobId = submit(new MyRDD(sc, 0, Nil), Array(), listener = fakeListener)
     assert(numResults === 0)
+    cancel(jobId)
   }
 
   test("run trivial job") {
