Filter PySpark DataFrame with where()

In this tutorial, we will look at how to use the Pyspark where() function to filter a Pyspark dataframe with the help of some examples.

How to filter a dataframe in Pyspark?

You can use the Pyspark where() method to filter data in a Pyspark dataframe. You can pass relational operators, SQL expressions, string functions, lists, etc. to the where() function to filter your dataframe.

The following is the syntax –

# dataframe is your pyspark dataframe
dataframe.where(condition)

It takes the filter expression/condition as an argument and returns the filtered data.

Examples

Let’s look at some examples of filtering data in a Pyspark dataframe using the where() function. First, let’s create a sample Pyspark dataframe that we will be using throughout this tutorial.

#import the pyspark module
import pyspark
  
# import the SparkSession class from pyspark.sql
from pyspark.sql import SparkSession

# create an app from SparkSession class
spark = SparkSession.builder.appName('datascience_parichay').getOrCreate()

# books data as list of lists
books = [[1, "PHP", "Sravan", 250],
         [2, "SQL", "Chandra", 300],
         [3, "Python", "Harsha", 250],
         [4, "R", "Rohith", 1200],
         [5, "Hadoop", "Manasa", 700],
         ]

# creating dataframe from books data
dataframe = spark.createDataFrame(books, ['Book_Id', 'Book_Name', 'Author', 'Price'])

# display the dataframe
dataframe.show()

Output:

+-------+---------+-------+-----+
|Book_Id|Book_Name| Author|Price|
+-------+---------+-------+-----+
|      1|      PHP| Sravan|  250|
|      2|      SQL|Chandra|  300|
|      3|   Python| Harsha|  250|
|      4|        R| Rohith| 1200|
|      5|   Hadoop| Manasa|  700|
+-------+---------+-------+-----+

We now have a dataframe containing 5 rows and 4 columns with information about different books.

Filter data using relational operators

You can pass expressions containing relational operators (for example, <, >, ==, <=, >=, etc.) to the where() function. Let’s filter the above data such that we only have data for books with “Book_Id” greater than 2.

# filter the book data for Book_Id > 2
dataframe.where(dataframe.Book_Id > 2).show()

Output:

+-------+---------+------+-----+
|Book_Id|Book_Name|Author|Price|
+-------+---------+------+-----+
|      3|   Python|Harsha|  250|
|      4|        R|Rohith| 1200|
|      5|   Hadoop|Manasa|  700|
+-------+---------+------+-----+

We get dataframe rows satisfying the condition defined in the expression, books with “Book_Id” > 2.

Let’s look at another example.

Let’s filter for the book with the exact title “R”. For this, we will be using the equality operator.

# filter book data for Book_Name "R"
dataframe.where(dataframe.Book_Name == 'R').show()

Output:

+-------+---------+------+-----+
|Book_Id|Book_Name|Author|Price|
+-------+---------+------+-----+
|      4|        R|Rohith| 1200|
+-------+---------+------+-----+

We only get data about the book with the title “R”.

Filter dataframe on list of values

We can use the where() function in combination with the isin() function to filter a dataframe based on a list of values. For example, let's get the data for books written by the authors in the list ['Manasa', 'Rohith'].

# filter data based on list values 
ls = ['Manasa','Rohith'] 
dataframe.where(dataframe.Author.isin(ls)).show()

Output:

+-------+---------+------+-----+
|Book_Id|Book_Name|Author|Price|
+-------+---------+------+-----+
|      4|        R|Rohith| 1200|
|      5|   Hadoop|Manasa|  700|
+-------+---------+------+-----+

You can see that we get data filtered by values in the list of authors used.

Filter dataframe with string functions

You can also use string functions (on columns with string data) to filter a Pyspark dataframe. For example, you can use the string startswith() function to filter for records in a column starting with some specific string.

Let’s look at some examples.

# filter data for author name starting with R
dataframe.where(dataframe.Author.startswith("R")).show()

# filter data for author name ending with h
dataframe.where(dataframe.Author.endswith("h")).show()

Output:

+-------+---------+------+-----+
|Book_Id|Book_Name|Author|Price|
+-------+---------+------+-----+
|      4|        R|Rohith| 1200|
+-------+---------+------+-----+

+-------+---------+------+-----+
|Book_Id|Book_Name|Author|Price|
+-------+---------+------+-----+
|      4|        R|Rohith| 1200|
+-------+---------+------+-----+

Here, we first filter the dataframe for author names starting with "R" and then for author names ending with "h". Note that show() only prints the dataframe and returns None, so there is no need to wrap it in print().



Authors

  • Piyush Raj

    Piyush is a data professional passionate about using data to understand things better and make informed decisions. He has experience working as a Data Scientist in the consulting domain and holds an engineering degree from IIT Roorkee. His hobbies include watching cricket, reading, and working on side projects.

  • Gottumukkala Sravan Kumar