{"payload":{"feedbackUrl":"https://github.com/orgs/community/discussions/53140","repo":{"id":165840641,"defaultBranch":"spark-3.5","name":"spark","ownerLogin":"Telefonica","currentUserCanPush":false,"isFork":true,"isEmpty":false,"createdAt":"2019-01-15T11:39:47.000Z","ownerAvatar":"https://avatars.githubusercontent.com/u/1536176?v=4","public":true,"private":false,"isOrgOwned":true},"refInfo":{"name":"","listCacheKey":"v0:1695118401.0","currentOid":""},"activityList":{"items":[{"before":"69305fb69c50ace21be2340a84c1366d5796c8cc","after":"a3102cf45c926d93c390aad6da043e6ce71abfee","ref":"refs/heads/spark-3.5","pushedAt":"2023-09-19T14:19:42.000Z","pushType":"force_push","commitsCount":0,"pusher":{"login":"pradomota","name":"Carlos del Prado Mota","path":"/pradomota","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/11295347?s=80&v=4"},"commit":{"message":"fix(tef): add retry for k8s executor and driver pods operations & random driver cm name","shortMessageHtmlLink":"fix(tef): add retry for k8s executor and driver pods operations & ran…"}},{"before":"2514fb2ed0c2132c6c6db6264298fe547c809a8b","after":"69305fb69c50ace21be2340a84c1366d5796c8cc","ref":"refs/heads/spark-3.5","pushedAt":"2023-09-19T10:49:57.000Z","pushType":"force_push","commitsCount":0,"pusher":{"login":"pradomota","name":"Carlos del Prado Mota","path":"/pradomota","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/11295347?s=80&v=4"}},{"before":"921ccfa17ba304ff0763b4c34dc032b2f09a1392","after":"2514fb2ed0c2132c6c6db6264298fe547c809a8b","ref":"refs/heads/spark-3.5","pushedAt":"2023-09-19T10:21:36.000Z","pushType":"force_push","commitsCount":0,"pusher":{"login":"pradomota","name":"Carlos del Prado Mota","path":"/pradomota","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/11295347?s=80&v=4"}},{"before":null,"after":"921ccfa17ba304ff0763b4c34dc032b2f09a1392","ref":"refs/heads/spark-3.5","pushedAt":"2023-09-19T10:13:21.000Z","pushType":"branch_creation","commitsCount":0,"pusher":{"login":"pradomota","name":"Carlos del Prado Mota","path":"/pradomota","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/11295347?s=80&v=4"}},{"before":"97349986daa5aeacaa320f7187681a72c4830b13","after":"eec090755aa5b7e6048fc004264a8f5d3591df1a","ref":"refs/heads/master","pushedAt":"2023-09-19T09:24:18.000Z","pushType":"push","commitsCount":2519,"pusher":{"login":"pradomota","name":"Carlos del Prado Mota","path":"/pradomota","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/11295347?s=80&v=4"},"commit":{"message":"[SPARK-45211][CONNECT] Eliminated ambiguous references in `CloseableIterator#apply` to fix Scala 2.13 daily test\n\n### What changes were proposed in this pull request?\nThis pr eliminated an ambiguous references in `org.apache.spark.sql.connect.client.CloseableIterator#apply` function to make the test case `abandoned query gets INVALID_HANDLE.OPERATION_ABANDONED error` can test pass with Scala 2.13.\n\n### Why are the changes needed?\n`abandoned query gets INVALID_HANDLE.OPERATION_ABANDONED error` failed in the daily test of Scala 2.13:\n- https://github.com/apache/spark/actions/runs/6215331575/job/16868131377\n\n\"image\"\n\n### Does this PR introduce _any_ user-facing change?\nNo\n\n### How was this patch tested?\n- Pass GitHub Actions\n- Manual check\n\nrun\n\n```\ndev/change-scala-version.sh 2.13\nbuild/sbt \"connect/testOnly org.apache.spark.sql.connect.execution.ReattachableExecuteSuite\" -Pscala-2.13\n```\n\n**Before**\n\n```\n[info] ReattachableExecuteSuite:\n[info] - reattach after initial 
RPC ends (2 seconds, 258 milliseconds)\n[info] - raw interrupted RPC results in INVALID_CURSOR.DISCONNECTED error (30 milliseconds)\n[info] - raw new RPC interrupts previous RPC with INVALID_CURSOR.DISCONNECTED error (21 milliseconds)\n[info] - client INVALID_CURSOR.DISCONNECTED error is retried when rpc sender gets interrupted (602 milliseconds)\n[info] - client INVALID_CURSOR.DISCONNECTED error is retried when other RPC preempts this one (637 milliseconds)\n[info] - abandoned query gets INVALID_HANDLE.OPERATION_ABANDONED error *** FAILED *** (70 milliseconds)\n[info] Expected exception org.apache.spark.SparkException to be thrown, but java.lang.StackOverflowError was thrown (ReattachableExecuteSuite.scala:172)\n[info] org.scalatest.exceptions.TestFailedException:\n[info] at org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:472)\n[info] at org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:471)\n[info] at org.scalatest.funsuite.AnyFunSuite.newAssertionFailedException(AnyFunSuite.scala:1564)\n[info] at org.scalatest.Assertions.intercept(Assertions.scala:756)\n[info] at org.scalatest.Assertions.intercept$(Assertions.scala:746)\n[info] at org.scalatest.funsuite.AnyFunSuite.intercept(AnyFunSuite.scala:1564)\n[info] at org.apache.spark.sql.connect.execution.ReattachableExecuteSuite.$anonfun$new$18(ReattachableExecuteSuite.scala:172)\n[info] at org.apache.spark.sql.connect.execution.ReattachableExecuteSuite.$anonfun$new$18$adapted(ReattachableExecuteSuite.scala:168)\n[info] at org.apache.spark.sql.connect.SparkConnectServerTest.withCustomBlockingStub(SparkConnectServerTest.scala:222)\n[info] at org.apache.spark.sql.connect.SparkConnectServerTest.withCustomBlockingStub$(SparkConnectServerTest.scala:216)\n[info] at org.apache.spark.sql.connect.execution.ReattachableExecuteSuite.withCustomBlockingStub(ReattachableExecuteSuite.scala:30)\n[info] at org.apache.spark.sql.connect.execution.ReattachableExecuteSuite.$anonfun$new$16(ReattachableExecuteSuite.scala:168)\n[info] at org.apache.spark.sql.connect.execution.ReattachableExecuteSuite.$anonfun$new$16$adapted(ReattachableExecuteSuite.scala:151)\n[info] at org.apache.spark.sql.connect.SparkConnectServerTest.withClient(SparkConnectServerTest.scala:199)\n[info] at org.apache.spark.sql.connect.SparkConnectServerTest.withClient$(SparkConnectServerTest.scala:191)\n[info] at org.apache.spark.sql.connect.execution.ReattachableExecuteSuite.withClient(ReattachableExecuteSuite.scala:30)\n[info] at org.apache.spark.sql.connect.execution.ReattachableExecuteSuite.$anonfun$new$15(ReattachableExecuteSuite.scala:151)\n[info] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)\n[info] at org.scalatest.enablers.Timed$$anon$1.timeoutAfter(Timed.scala:127)\n[info] at org.scalatest.concurrent.TimeLimits$.failAfterImpl(TimeLimits.scala:282)\n[info] at org.scalatest.concurrent.TimeLimits.failAfter(TimeLimits.scala:231)\n[info] at org.scalatest.concurrent.TimeLimits.failAfter$(TimeLimits.scala:230)\n[info] at org.apache.spark.SparkFunSuite.failAfter(SparkFunSuite.scala:69)\n[info] at org.apache.spark.SparkFunSuite.$anonfun$test$2(SparkFunSuite.scala:155)\n[info] at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)\n[info] at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)\n[info] at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)\n[info] at org.scalatest.Transformer.apply(Transformer.scala:22)\n[info] at org.scalatest.Transformer.apply(Transformer.scala:20)\n[info] at 
org.scalatest.funsuite.AnyFunSuiteLike$$anon$1.apply(AnyFunSuiteLike.scala:226)\n[info] at org.apache.spark.SparkFunSuite.withFixture(SparkFunSuite.scala:227)\n[info] at org.scalatest.funsuite.AnyFunSuiteLike.invokeWithFixture$1(AnyFunSuiteLike.scala:224)\n[info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTest$1(AnyFunSuiteLike.scala:236)\n[info] at org.scalatest.SuperEngine.runTestImpl(Engine.scala:306)\n[info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest(AnyFunSuiteLike.scala:236)\n[info] at org.scalatest.funsuite.AnyFunSuiteLike.runTest$(AnyFunSuiteLike.scala:218)\n[info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterEach$$super$runTest(SparkFunSuite.scala:69)\n[info] at org.scalatest.BeforeAndAfterEach.runTest(BeforeAndAfterEach.scala:234)\n[info] at org.scalatest.BeforeAndAfterEach.runTest$(BeforeAndAfterEach.scala:227)\n[info] at org.apache.spark.SparkFunSuite.runTest(SparkFunSuite.scala:69)\n[info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$runTests$1(AnyFunSuiteLike.scala:269)\n[info] at org.scalatest.SuperEngine.$anonfun$runTestsInBranch$1(Engine.scala:413)\n[info] at scala.collection.immutable.List.foreach(List.scala:333)\n[info] at org.scalatest.SuperEngine.traverseSubNodes$1(Engine.scala:401)\n[info] at org.scalatest.SuperEngine.runTestsInBranch(Engine.scala:396)\n[info] at org.scalatest.SuperEngine.runTestsImpl(Engine.scala:475)\n[info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests(AnyFunSuiteLike.scala:269)\n[info] at org.scalatest.funsuite.AnyFunSuiteLike.runTests$(AnyFunSuiteLike.scala:268)\n[info] at org.scalatest.funsuite.AnyFunSuite.runTests(AnyFunSuite.scala:1564)\n[info] at org.scalatest.Suite.run(Suite.scala:1114)\n[info] at org.scalatest.Suite.run$(Suite.scala:1096)\n[info] at org.scalatest.funsuite.AnyFunSuite.org$scalatest$funsuite$AnyFunSuiteLike$$super$run(AnyFunSuite.scala:1564)\n[info] at org.scalatest.funsuite.AnyFunSuiteLike.$anonfun$run$1(AnyFunSuiteLike.scala:273)\n[info] at org.scalatest.SuperEngine.runImpl(Engine.scala:535)\n[info] at org.scalatest.funsuite.AnyFunSuiteLike.run(AnyFunSuiteLike.scala:273)\n[info] at org.scalatest.funsuite.AnyFunSuiteLike.run$(AnyFunSuiteLike.scala:272)\n[info] at org.apache.spark.SparkFunSuite.org$scalatest$BeforeAndAfterAll$$super$run(SparkFunSuite.scala:69)\n[info] at org.scalatest.BeforeAndAfterAll.liftedTree1$1(BeforeAndAfterAll.scala:213)\n[info] at org.scalatest.BeforeAndAfterAll.run(BeforeAndAfterAll.scala:210)\n[info] at org.scalatest.BeforeAndAfterAll.run$(BeforeAndAfterAll.scala:208)\n[info] at org.apache.spark.SparkFunSuite.run(SparkFunSuite.scala:69)\n[info] at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:321)\n[info] at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:517)\n[info] at sbt.ForkMain$Run.lambda$runTest$1(ForkMain.java:414)\n[info] at java.util.concurrent.FutureTask.run(FutureTask.java:266)\n[info] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)\n[info] at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)\n[info] at java.lang.Thread.run(Thread.java:750)\n[info] Cause: java.lang.StackOverflowError:\n[info] at org.apache.spark.sql.connect.client.WrappedCloseableIterator.hasNext(CloseableIterator.scala:36)\n[info] at org.apache.spark.sql.connect.client.WrappedCloseableIterator.hasNext(CloseableIterator.scala:36)\n[info] at org.apache.spark.sql.connect.client.WrappedCloseableIterator.hasNext(CloseableIterator.scala:36)\n[info] at 
org.apache.spark.sql.connect.client.WrappedCloseableIterator.hasNext(CloseableIterator.scala:36)\n[info] at org.apache.spark.sql.connect.client.WrappedCloseableIterator.hasNext(CloseableIterator.scala:36)\n[info] at org.apache.spark.sql.connect.client.WrappedCloseableIterator.hasNext(CloseableIterator.scala:36)\n[info] at org.apache.spark.sql.connect.client.WrappedCloseableIterator.hasNext(CloseableIterator.scala:36)\n[info] at org.apache.spark.sql.connect.client.WrappedCloseableIterator.hasNext(CloseableIterator.scala:36)\n[info] at org.apache.spark.sql.connect.client.WrappedCloseableIterator.hasNext(CloseableIterator.scala:36)\n[info] at org.apache.spark.sql.connect.client.WrappedCloseableIterator.hasNext(CloseableIterator.scala:36)\n[info] at org.apache.spark.sql.connect.client.WrappedCloseableIterator.hasNext(CloseableIterator.scala:36)\n[info] at org.apache.spark.sql.connect.client.WrappedCloseableIterator.hasNext(CloseableIterator.scala:36)\n[info] at org.apache.spark.sql.connect.client.WrappedCloseableIterator.hasNext(CloseableIterator.scala:36)\n[info] at org.apache.spark.sql.connect.client.WrappedCloseableIterator.hasNext(CloseableIterator.scala:36)\n[info] at org.apache.spark.sql.connect.client.WrappedCloseableIterator.hasNext(CloseableIterator.scala:36)\n[info] at org.apache.spark.sql.connect.client.WrappedCloseableIterator.hasNext(CloseableIterator.scala:36)\n...\n[info] - client releases responses directly after consuming them (236 milliseconds)\n[info] - server releases responses automatically when client moves ahead (336 milliseconds)\n[info] - big query (863 milliseconds)\n[info] - big query and slow client (7 seconds, 14 milliseconds)\n[info] - big query with frequent reattach (735 milliseconds)\n[info] - big query with frequent reattach and slow client (7 seconds, 606 milliseconds)\n[info] - long sleeping query (10 seconds, 156 milliseconds)\n[info] Run completed in 34 seconds, 522 milliseconds.\n[info] Total number of tests run: 13\n[info] Suites: completed 1, aborted 0\n[info] Tests: succeeded 12, failed 1, canceled 0, ignored 0, pending 0\n[info] *** 1 TEST FAILED ***\n[error] Failed tests:\n[error] \torg.apache.spark.sql.connect.execution.ReattachableExecuteSuite\n```\n\n**After**\n\n```\n[info] ReattachableExecuteSuite:\n[info] - reattach after initial RPC ends (2 seconds, 134 milliseconds)\n[info] - raw interrupted RPC results in INVALID_CURSOR.DISCONNECTED error (26 milliseconds)\n[info] - raw new RPC interrupts previous RPC with INVALID_CURSOR.DISCONNECTED error (19 milliseconds)\n[info] - client INVALID_CURSOR.DISCONNECTED error is retried when rpc sender gets interrupted (328 milliseconds)\n[info] - client INVALID_CURSOR.DISCONNECTED error is retried when other RPC preempts this one (562 milliseconds)\n[info] - abandoned query gets INVALID_HANDLE.OPERATION_ABANDONED error (46 milliseconds)\n[info] - client releases responses directly after consuming them (231 milliseconds)\n[info] - server releases responses automatically when client moves ahead (359 milliseconds)\n[info] - big query (978 milliseconds)\n[info] - big query and slow client (7 seconds, 50 milliseconds)\n[info] - big query with frequent reattach (703 milliseconds)\n[info] - big query with frequent reattach and slow client (7 seconds, 626 milliseconds)\n[info] - long sleeping query (10 seconds, 141 milliseconds)\n[info] Run completed in 33 seconds, 844 milliseconds.\n[info] Total number of tests run: 13\n[info] Suites: completed 1, aborted 0\n[info] Tests: succeeded 13, failed 0, canceled 0, ignored 
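The fix itself is not included in this feed, only the commit message above. As a loose illustration of the failure mode it describes (not the actual Spark Connect code), the repeated `WrappedCloseableIterator.hasNext` frames in the "Before" log are what you get when a wrapping iterator's inner reference ambiguously resolves to the wrapper itself instead of the iterator being wrapped, so `hasNext` calls itself until the stack overflows. A minimal Scala sketch, with hypothetical and heavily simplified names:

```scala
// Illustrative sketch only: simplified stand-ins for the Spark Connect
// client's CloseableIterator, not the real implementation.
trait CloseableIterator[E] extends Iterator[E] {
  def close(): Unit
}

object CloseableIterator {
  // Wraps a plain Iterator in a CloseableIterator.
  def apply[E](iterator: Iterator[E]): CloseableIterator[E] =
    new CloseableIterator[E] {
      // Bind the wrapped iterator to an unambiguous local name. If hasNext
      // instead resolved (ambiguously) back to this wrapper, every call
      // would recurse into itself and end in java.lang.StackOverflowError,
      // the symptom shown in the "Before" log above.
      private val underlying: Iterator[E] = iterator
      override def hasNext: Boolean = underlying.hasNext
      override def next(): E = underlying.next()
      override def close(): Unit = () // nothing to release in this sketch
    }
}

// Example usage of the sketch:
// CloseableIterator(Iterator(1, 2, 3)).toList == List(1, 2, 3)
```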
2023-08-04T06:28:06Z - Eric Blanco (ejblanco) created branch refs/heads/test-3.3 at a02b1f7.
Commit: "executor def"

2023-05-31T13:08:36Z - ejblanco created branch refs/heads/spark-3.4 at 142b895.
Commit: "feat: spark 3.4.0"

2023-03-15T12:47:45Z - ejblanco pushed 1 commit to refs/heads/spark-3.1 (a654ed7 -> 4dc3203).
Commit: "change val configMapNameDriver to def"

2023-03-14T11:14:56Z - ejblanco pushed 1 commit to refs/heads/spark-3.3 (e8d81c3 -> f11cbc6).
Commit: "change exec to val"

2023-03-10T08:46:40Z - ejblanco pushed 1 commit to refs/heads/spark-3.1 (9f928d4 -> a654ed7).
Commit: "fix"

2023-03-10T08:45:52Z - ejblanco pushed 1 commit to refs/heads/spark-3.1 (32e194b -> 9f928d4).
Commit: "fix"

2023-03-09T13:48:30Z - ejblanco pushed 1 commit to refs/heads/spark-3.1 (1af90ea -> 32e194b).
Commit: "retry driver and executor creation"

2023-03-09T11:58:12Z - ejblanco pushed 1 commit to refs/heads/spark-3.3 (3f690c1 -> e8d81c3).
Commit: "finally working"

2023-03-08T16:21:09Z - ejblanco pushed 1 commit to refs/heads/spark-3.3 (fdecfec -> 3f690c1).
Commit: "fix again"

2023-03-08T15:28:10Z - ejblanco pushed 1 commit to refs/heads/spark-3.3 (8bb9fdc -> fdecfec).
Commit: "fix name"

2023-03-08T14:02:45Z - ejblanco pushed 87 commits to refs/heads/spark-3.3 (3371600 -> 8bb9fdc).
Head commit: "chore: add retry for executor and driver"