PySpark Column | withField method

Last updated: Aug 12, 2023
Tags: PySpark

PySpark Column's withField(~) method is used to either add or update a nested field value.
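
As a quick illustration of the shape of the call, the sketch below uses a hypothetical DataFrame df with a struct column named info (the names here are made up for illustration only; full worked examples follow below):

import pyspark.sql.functions as F

# Replace the struct column "info" with a copy whose "city" field is overwritten.
# withField(~) returns a Column, so we attach it back to the DataFrame with withColumn(~).
df = df.withColumn("info", df["info"].withField("city", F.lit("Tokyo")))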

Parameters

1. fieldName | string

The name of the nested field.

2. col | Column

The new column value to add or update with.

Return Value

A PySpark Column (pyspark.sql.column.Column).
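
For instance, here is a small sanity check (a sketch using the df defined in the Examples section below) showing that the return value is a Column, and that the DataFrame itself is not modified until the column is attached back with withColumn(~):

import pyspark.sql.functions as F

# withField(~) only builds a column expression - df is untouched at this point
updated_col = df["friend"].withField("name", F.lit("X"))
print(type(updated_col))
<class 'pyspark.sql.column.Column'>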

Examples

Consider the following PySpark DataFrame with nested rows:

from pyspark.sql import Row
data = [
    Row(name="Alex", age=20, friend=Row(name="Bob", age=30)),
    Row(name="Cathy", age=40, friend=Row(name="Doge", age=40))
]
df = spark.createDataFrame(data)
df.show()
+-----+---+----------+
| name|age|    friend|
+-----+---+----------+
| Alex| 20| {Bob, 30}|
|Cathy| 40|{Doge, 40}|
+-----+---+----------+

Here, the friend column contains nested Rows, which we can confirm by printing out the schema:

df.printSchema()
root
 |-- name: string (nullable = true)
 |-- age: long (nullable = true)
 |-- friend: struct (nullable = true)
 |    |-- name: string (nullable = true)
 |    |-- age: long (nullable = true)
Updating nested rows in PySpark

To update nested rows, use the withField(~) method like so:

import pyspark.sql.functions as F
updated_col = df["friend"].withField("name", F.lit("BOB"))
df.withColumn("friend", updated_col).show()
+-----+---+---------+
| name|age|   friend|
+-----+---+---------+
| Alex| 20|{BOB, 30}|
|Cathy| 40|{BOB, 40}|
+-----+---+---------+

Note the following:

  • we are updating the name field of the friend column with a constant string "BOB".

  • F.lit("BOB") returns a Column object whose values are filled with the string "BOB".

  • the withColumn(~) method replaces the friend column of our DataFrame with the updated column returned by withField(~).
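
Note that the two steps above can also be collapsed into a single expression, which produces the same output:

# Update the nested field and attach the result in one line
df.withColumn("friend", df["friend"].withField("name", F.lit("BOB"))).show()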

Updating nested rows using original values in PySpark

To update nested rows using original values, use the withField(~) method like so:

updated_col = df["friend"].withField("name", F.upper("friend.name"))
df.withColumn("friend", updated_col).show()
+-----+---+----------+
| name|age|    friend|
+-----+---+----------+
| Alex| 20| {BOB, 30}|
|Cathy| 40|{DOGE, 40}|
+-----+---+----------+

Here, we are uppercasing the name field of the friend column using F.upper("friend.name"), which returns a Column object.
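
As a side note, the nested field could equally be referenced through F.col(~) rather than a plain string; the sketch below is an equivalent way of writing the same update:

# Equivalent: reference the nested field via F.col(~) instead of a string
updated_col = df["friend"].withField("name", F.upper(F.col("friend.name")))
df.withColumn("friend", updated_col).show()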

Adding new field values in nested rows in PySpark

The withField(~) method can also be used to add new fields to nested rows:

updated_col = df["friend"].withField("upper_name", F.upper("friend.name"))
df_new = df.withColumn("friend", updated_col)
df_new.show()
+-----+---+----------------+
| name|age|          friend|
+-----+---+----------------+
| Alex| 20|  {Bob, 30, BOB}|
|Cathy| 40|{Doge, 40, DOGE}|
+-----+---+----------------+

Now, let's check the schema of our new PySpark DataFrame:

df_new.printSchema()
root
 |-- name: string (nullable = true)
 |-- age: long (nullable = true)
 |-- friend: struct (nullable = true)
 |    |-- name: string (nullable = true)
 |    |-- age: long (nullable = true)
 |    |-- upper_name: string (nullable = true)

We can see the new nested field upper_name has been added!
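
Finally, since withField(~) itself returns a Column, calls can be chained to update one field and add another in a single pass. The following is a brief sketch built on the same df (it is not part of the original recipe):

# Chain withField(~) calls: uppercase the existing name field
# and add a new upper_name field in one go
updated_col = (df["friend"]
               .withField("name", F.upper("friend.name"))
               .withField("upper_name", F.upper("friend.name")))
df.withColumn("friend", updated_col).show()

Here, both fields are derived from the original friend.name values, since the string "friend.name" is resolved against the DataFrame's friend column.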
