pyspark - Providing table options while writing to an Iceberg table using Spark
I am trying to write to an Iceberg table (which does not exist before the write, so it is created during the write) and would like to provide a few table properties. Is it possible to do so using the DataFrameWriter? I do not want to fire a SQL query using spark.sql().
The following are some of the configs I am using:

    "spark.sql.catalog.spark_catalog": "org.apache.iceberg.spark.SparkSessionCatalog"
    "spark.sql.extensions": "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions"
    "spark.sql.catalogImplementation": "hive"
Using DataFrameWriterV2, this is possible:
    from pyspark.sql.functions import lit

    # Create (or replace) an Iceberg table, setting a table property inline.
    (spark.range(10)
        .withColumn("tmp", lit("hi"))
        .writeTo("test.sample")
        .using("iceberg")
        .tableProperty("write.spark.accept-any-schema", "true")
        .createOrReplace())
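The same builder can chain several properties, a partition spec, and different terminal actions. A sketch (the table and column names are hypothetical; format-version and write.target-file-size-bytes are documented Iceberg table properties):

    from pyspark.sql import functions as F

    # Sketch: create a partitioned Iceberg table with multiple properties.
    (df.writeTo("db.events")                        # hypothetical table name
       .using("iceberg")
       .tableProperty("format-version", "2")
       .tableProperty("write.target-file-size-bytes", "134217728")
       .partitionedBy(F.days(F.col("event_ts")))    # daily partition transform
       .create())                                   # create() fails if the table exists;
                                                    # createOrReplace() replaces it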
Options accepted by .option() are listed in the Iceberg docs: iceberg.apache.org/docs/latest/spark-configuration/… – mazaneicha, Mar 21
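Those per-write options are distinct from the persistent table properties above and go through .option() on the same writer. A small sketch using two documented Iceberg write options (the table name and the snapshot-property key are hypothetical):

    # Sketch: per-write options, as opposed to persistent table properties.
    (df.writeTo("db.events")
       .option("write-format", "parquet")               # file format for this write only
       .option("snapshot-property.source", "nightly")   # custom snapshot metadata
       .append())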