
Unsupported arguments [Array(StringArray[..]), Scalar(Int32(1)), Array(StringArray[..])] for function rpad/read_side_padding #2475


Description

@wForget

Describe the bug

Got an error during fuzz testing: wForget/fuzz-test-spark-native#277

SQL:

SELECT c27, 1, c0, rpad(c27, 1, c0) AS x FROM test1 ORDER BY c27, 1, c0;

Table schema:

Created table test1 with schema:
    c0: StringType
    c27: StringType
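
A minimal reproduction sketch (not part of the original report; the table rows, Comet plugin class, and config keys below are assumptions to adapt to your setup). It registers a two-string-column test1 table and runs the failing query, where the pad argument to rpad is a column rather than a literal:

```scala
import org.apache.spark.sql.SparkSession

object RpadRepro {
  def main(args: Array[String]): Unit = {
    // Assumed Comet configuration; adjust plugin class / config keys to your Comet version.
    val spark = SparkSession.builder()
      .master("local[1]")
      .appName("rpad-read-side-padding-repro")
      .config("spark.plugins", "org.apache.spark.CometPlugin")
      .config("spark.comet.enabled", "true")
      .config("spark.comet.exec.enabled", "true")
      .getOrCreate()

    import spark.implicits._

    // Two string columns mirroring the fuzzer schema (c0: StringType, c27: StringType).
    // Row values are placeholders, not the fuzzer's data.
    Seq(("pad", "abc"), ("xy", "defgh"), (null: String, "ij"))
      .toDF("c0", "c27")
      .createOrReplaceTempView("test1")

    // The failing query: because the pad string is a column, the native side sees the
    // [Array(StringArray), Scalar(Int32(1)), Array(StringArray)] argument shape.
    spark.sql(
      "SELECT c27, 1, c0, rpad(c27, 1, c0) AS x FROM test1 ORDER BY c27, 1, c0"
    ).show(false)
  }
}
```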

Error:

[ERROR] Query failed in native engine comet: Job aborted due to stage failure: Task 0 in stage 1318.0 failed 1 times, most recent failure: Lost task 0.0 in stage 1318.0 (TID 915) (runnervm3ublj executor driver): org.apache.comet.CometNativeException: Unsupported arguments [Array(StringArray
[
  "8565193544385836345",
  "慄暗瘈뱑ᦺ㸼鶫韷",
  "103",
  null,
  "-3852769133872082214",
  "0.2587759106724863",
  "-38",
  "0.6385826953937865",
  "⿇踖呺懗㖀鑶儧閊",
  "㊆뇽跎箸旨꠲\u{30b}길",
  ...180 elements...,
  null,
  "眚厏⠏Ꮄ하밸\u{aab7}㞱",
  null,
  "光첷䑃絺㊈꺒䛢쟪",
  "-80",
  "鯾炲놼ਬꘑ婀闍讉",
  "蛒떓浀箼鷒㠎ዹ鯤",
  "ꡇᶠ歿釥꼙쪘忖롘",
  "䤲笛洒㵮官嘧ᄪ徹",
  "4",
]), Scalar(Int32(1)), Array(StringArray
[
  "㩠㝗긂쒼⎲ⱱ濥\u{a7ec}",
  "ꌙ풇렾ꑵ㠚㑘\u{1755}됩",
  "-103",
  "喜㏦驛궉凝Ἁ鴏Ȼ",
  "ラ滓⾈픻뀋牵딻䟦",
  "49",
  "122",
  "0.2134187034749193",
  null,
  null,
  ...180 elements...,
  "峙ℰ䫭䀳䧢Ίඖ슐",
  "-109",
  null,
  "123",
  "㕅캳밚⁄樉킿햇頉",
  "3439690211593789959",
  "0.6273652139162924",
  "뤷澓⎮ᨙ픶賀㶂㫆",
  "34",
  "價籌䋥款礱ⲉlj\u{abe8}",
])] for function rpad/read_side_padding.
This issue was likely caused by a bug in DataFusion's code. Please help us to resolve this by filing a bug report in our issue tracker: https://github.com/apache/datafusion/issues
	at org.apache.comet.Native.executePlan(Native Method)
	at org.apache.comet.CometExecIterator.$anonfun$getNextBatch$2(CometExecIterator.scala:173)
	at org.apache.comet.CometExecIterator.$anonfun$getNextBatch$2$adapted(CometExecIterator.scala:172)
	at org.apache.comet.vector.NativeUtil.getNextBatch(NativeUtil.scala:212)
	at org.apache.comet.CometExecIterator.$anonfun$getNextBatch$1(CometExecIterator.scala:172)
	at org.apache.comet.Tracing$.withTrace(Tracing.scala:31)
	at org.apache.comet.CometExecIterator.getNextBatch(CometExecIterator.scala:170)
	at org.apache.comet.CometExecIterator.hasNext(CometExecIterator.scala:221)
	at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:491)
	at scala.collection.Iterator$$anon$10.hasNext(Iterator.scala:460)
	at org.apache.spark.util.random.SamplingUtils$.reservoirSampleAndCount(SamplingUtils.scala:41)
	at org.apache.spark.RangePartitioner$.$anonfun$sketch$1(Partitioner.scala:322)
	at org.apache.spark.RangePartitioner$.$anonfun$sketch$1$adapted(Partitioner.scala:320)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndex$2(RDD.scala:910)
	at org.apache.spark.rdd.RDD.$anonfun$mapPartitionsWithIndex$2$adapted(RDD.scala:910)
	at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
	at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:367)
	at org.apache.spark.rdd.RDD.iterator(RDD.scala:331)
	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:93)
	at org.apache.spark.TaskContext.runTaskWithListeners(TaskContext.scala:166)
	at org.apache.spark.scheduler.Task.run(Task.scala:141)
	at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$4(Executor.scala:621)
	at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally(SparkErrorUtils.scala:64)
	at org.apache.spark.util.SparkErrorUtils.tryWithSafeFinally$(SparkErrorUtils.scala:61)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:94)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:624)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.base/java.lang.Thread.run(Thread.java:840)

Driver stacktrace:

Steps to reproduce

No response

Expected behavior

No response

Additional context

No response
