In this tutorial, we will look at how to filter data in a Pyspark dataframe with the help of some examples.
How to filter data in a Pyspark dataframe?

You can use the Pyspark dataframe filter() function to filter the data in the dataframe based on your desired criteria. The following is the syntax –
# df is a pyspark dataframe
df.filter(filter_expression)
It takes a condition or expression as a parameter and returns the filtered dataframe.
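The condition can be built from a column expression or, if you find it more readable, passed as a SQL-style string. Here is a minimal sketch of the two equivalent forms (assuming a dataframe df with a Price column):

# condition built from a column expression
df.filter(df["Price"] < 500)

# equivalent SQL-style string expression
df.filter("Price < 500")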
Examples
Let’s look at the usage of the Pyspark filter() function with the help of some examples. First, we’ll create a Pyspark dataframe that we’ll be using throughout this tutorial.
# import the pyspark module
import pyspark

# import the SparkSession class from pyspark.sql
from pyspark.sql import SparkSession

# create an app from SparkSession class
spark = SparkSession.builder.appName('datascience_parichay').getOrCreate()

# books data as list of lists
df = [[1, "PHP", "Sravan", 250],
      [2, "SQL", "Chandra", 300],
      [3, "Python", "Harsha", 250],
      [4, "R", "Rohith", 1200],
      [5, "Hadoop", "Manasa", 700]]

# creating dataframe from books data
dataframe = spark.createDataFrame(df, ['Book_Id', 'Book_Name', 'Author', 'Price'])

# display the dataframe
dataframe.show()
Output:
+-------+---------+-------+-----+
|Book_Id|Book_Name| Author|Price|
+-------+---------+-------+-----+
|      1|      PHP| Sravan|  250|
|      2|      SQL|Chandra|  300|
|      3|   Python| Harsha|  250|
|      4|        R| Rohith| 1200|
|      5|   Hadoop| Manasa|  700|
+-------+---------+-------+-----+
We now have a dataframe containing 5 rows and 4 columns with information about different books. Let’s now look at some ways you can filter the data.
Filter data with relational operators in Pyspark
Use relational operators (for example, <, >, <=, >=, ==, !=) to build a boolean expression and pass it as an argument to the filter() function.
Let’s filter the above dataframe such that we get all the books that have a price of less than 500.
# filter for Price < 500
dataframe.filter(dataframe["Price"] < 500).show()
Output:
+-------+---------+-------+-----+
|Book_Id|Book_Name| Author|Price|
+-------+---------+-------+-----+
|      1|      PHP| Sravan|  250|
|      2|      SQL|Chandra|  300|
|      3|   Python| Harsha|  250|
+-------+---------+-------+-----+
You can see that the resulting dataframe has only books priced less than 500.
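You can also combine multiple relational conditions with the & (and) and | (or) operators, wrapping each individual condition in parentheses. A small sketch on the same dataframe (this particular filter is our own addition, not part of the original example):

# books priced below 500 that are not written by "Sravan"
dataframe.filter((dataframe["Price"] < 500) & (dataframe["Author"] != "Sravan")).show()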
Filter data on a list of values
We can use the filter() function in combination with the isin() function to filter a dataframe based on a list of values. For example, let’s get the data on books written by a specified list of authors, ['Manasa', 'Rohith'].
# filter data based on list values
ls = ['Manasa', 'Rohith']
dataframe.filter(dataframe["Author"].isin(ls)).show()
Output:
+-------+---------+------+-----+
|Book_Id|Book_Name|Author|Price|
+-------+---------+------+-----+
|      4|        R|Rohith| 1200|
|      5|   Hadoop|Manasa|  700|
+-------+---------+------+-----+
You can see that the resulting dataframe contains only the books written by authors in the list.
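If, instead, you want the books whose author is not in the list, you can negate the isin() condition with the ~ operator. A quick sketch on the same dataframe (the negated filter is an assumption beyond the original example):

# filter for authors NOT in the list
dataframe.filter(~dataframe["Author"].isin(ls)).show()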
Filter dataframe with string functions
You can also use string functions (on columns with string data) to filter a Pyspark dataframe. For example, you can use the string startswith() function to filter for records in a column starting with some specific string.
Let’s look at some examples.
# filter data for author name starting with R
dataframe.filter(dataframe["Author"].startswith("R")).show()

# filter data for author name ending with h
dataframe.filter(dataframe["Author"].endswith("h")).show()
Output:
+-------+---------+------+-----+
|Book_Id|Book_Name|Author|Price|
+-------+---------+------+-----+
|      4|        R|Rohith| 1200|
+-------+---------+------+-----+

+-------+---------+------+-----+
|Book_Id|Book_Name|Author|Price|
+-------+---------+------+-----+
|      4|        R|Rohith| 1200|
+-------+---------+------+-----+
Here, the first filter keeps rows where the author name starts with “R”, and the second keeps rows where the author name ends with “h”.
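Pyspark columns also provide a contains() function that you can use to filter for rows where the column value contains a given substring. A minimal sketch on the same dataframe (the substring "an" is just an illustrative choice):

# filter for author names containing the substring "an"
dataframe.filter(dataframe["Author"].contains("an")).show()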
In this tutorial, we looked at how to use the filter() function in Pyspark to filter a Pyspark dataframe. You can also use the Pyspark where() function to similarly filter a Pyspark dataframe.
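For reference, where() is simply an alias for filter(), so the earlier examples work unchanged with it. For instance:

# where() is an alias of filter()
dataframe.where(dataframe["Price"] < 500).show()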
You might also be interested in –
- Aggregate Functions in PySpark
- Get DataFrame Records with Pyspark collect()
- Display DataFrame in Pyspark with show()