
Changing Spark configuration in spark/spark-defaults.conf does not change the Spark configuration #222

Open
@ankur334

Description

I wanted to try the Hadoop catalog instead of the REST catalog for one of my projects, so I changed spark/spark-defaults.conf as follows:

# Iceberg extensions
spark.sql.extensions                org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions

# ---- Hadoop catalog called 'local' --------------------------------
spark.sql.catalog.local              org.apache.iceberg.spark.SparkCatalog
spark.sql.catalog.local.type         hadoop
spark.sql.catalog.local.warehouse    s3://warehouse/wh/

# Use Iceberg's AWS SDK FileIO
spark.sql.catalog.local.io-impl      org.apache.iceberg.aws.s3.S3FileIO
spark.sql.catalog.local.s3.endpoint  http://minio:9000
spark.sql.catalog.local.s3.path-style-access true


# Set 'local' as the default catalog
spark.sql.defaultCatalog            local

# Spark event-log stuff (optional)
spark.eventLog.enabled              true
spark.eventLog.dir                  /home/iceberg/spark-events
spark.history.fs.logDirectory       /home/iceberg/spark-events
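
As a sanity check, the same settings can also be applied programmatically when building the session, so the Hadoop catalog can be tested even if spark-defaults.conf is not being read at all. This is only a sketch: it assumes it runs inside the same container and reuses the catalog name, warehouse path, and MinIO endpoint from the config above. Any already-running session (e.g. the one the quickstart notebook creates) would need to be stopped first, because getOrCreate() ignores new configs when a session already exists.

from pyspark.sql import SparkSession

# Sketch only: mirrors the spark-defaults.conf entries above; the endpoint,
# warehouse bucket, and catalog name are the same assumptions as in that file.
spark = (
    SparkSession.builder
    .appName("hadoop-catalog-check")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.local", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.local.type", "hadoop")
    .config("spark.sql.catalog.local.warehouse", "s3://warehouse/wh/")
    .config("spark.sql.catalog.local.io-impl", "org.apache.iceberg.aws.s3.S3FileIO")
    .config("spark.sql.catalog.local.s3.endpoint", "http://minio:9000")
    .config("spark.sql.catalog.local.s3.path-style-access", "true")
    .config("spark.sql.defaultCatalog", "local")
    .getOrCreate()
)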

I made the spark-defaults.conf changes above and commented out the REST catalog section in the docker-compose.yml file, but when I run spark.sparkContext.getConf().getAll() I am still getting:

('spark.sql.extensions', 'org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions'),
('spark.sql.catalog.demo.uri', 'http://rest:8181/'),
('spark.sql.catalog.demo.type', 'rest'),
('spark.sql.catalog.demo', 'org.apache.iceberg.spark.SparkCatalog'),
('spark.sql.defaultCatalog', 'demo'),

I think something is broken: the Spark configuration is not picking up the changes made in spark-defaults.conf.
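
One way to narrow this down is to check which spark-defaults.conf the driver is actually reading and compare it with what the live session resolved. The sketch below assumes a running session named spark inside the container; the SPARK_CONF_DIR / SPARK_HOME lookup mirrors where Spark loads its defaults file from.

import os

# Sketch only: locate the defaults file Spark would load (SPARK_CONF_DIR,
# falling back to $SPARK_HOME/conf) and print it for comparison.
conf_dir = os.environ.get("SPARK_CONF_DIR") or os.path.join(os.environ.get("SPARK_HOME", ""), "conf")
defaults = os.path.join(conf_dir, "spark-defaults.conf")
print("Reading defaults from:", defaults)
with open(defaults) as f:
    print(f.read())

# Compare with what the running session actually resolved.
for key, value in sorted(spark.sparkContext.getConf().getAll()):
    if key.startswith("spark.sql.catalog") or key == "spark.sql.defaultCatalog":
        print(key, "=", value)

If the printed file still shows the demo REST catalog entries, then the edited file is not the one the container is using, which would explain the behaviour.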
