In this tutorial, we will look at how to use the Pyspark where() function to filter a Pyspark dataframe with the help of some examples.
How to filter a dataframe in Pyspark?
You can use the Pyspark where() method to filter data in a Pyspark dataframe. You can use relational operators, SQL expressions, string functions, lists, etc. to filter your dataframe with the where() function.
The following is the syntax –
# dataframe is your pyspark dataframe
dataframe.where(condition)
It takes the filter expression/condition as an argument and returns the filtered data.
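Note that where() is an alias for the Pyspark filter() function, so the two can be used interchangeably. The condition can be passed either as a Column expression or as a SQL expression string. The following is a minimal sketch of both forms (it uses the Price column from the sample dataframe we create below):

# 1. condition as a Column expression
dataframe.where(dataframe.Price > 500)
# 2. the same condition as a SQL expression string
dataframe.where("Price > 500")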
Examples
Let’s look at some examples of filtering data in a Pyspark dataframe using the where() function. First, let’s create a sample Pyspark dataframe that we will be using throughout this tutorial.
# import the pyspark module
import pyspark
# import the sparksession class from pyspark.sql
from pyspark.sql import SparkSession

# create an app from SparkSession class
spark = SparkSession.builder.appName('datascience_parichay').getOrCreate()

# books data as list of lists
df = [[1, "PHP", "Sravan", 250],
      [2, "SQL", "Chandra", 300],
      [3, "Python", "Harsha", 250],
      [4, "R", "Rohith", 1200],
      [5, "Hadoop", "Manasa", 700]]

# creating dataframe from books data
dataframe = spark.createDataFrame(df, ['Book_Id', 'Book_Name', 'Author', 'Price'])

# display the dataframe
dataframe.show()
Output:
+-------+---------+-------+-----+
|Book_Id|Book_Name| Author|Price|
+-------+---------+-------+-----+
|      1|      PHP| Sravan|  250|
|      2|      SQL|Chandra|  300|
|      3|   Python| Harsha|  250|
|      4|        R| Rohith| 1200|
|      5|   Hadoop| Manasa|  700|
+-------+---------+-------+-----+
We now have a dataframe containing 5 rows and 4 columns with information about different books.
Filter data using relational operators
You can pass expressions containing relational operators (for example, <, >, ==, <=, >=, etc.) to the where() function. Let’s filter the above data such that we only have data for books with “Book_Id” greater than 2.
# filter the book data for Book_Id > 2
dataframe.where(dataframe.Book_Id > 2).show()
Output:
+-------+---------+------+-----+
|Book_Id|Book_Name|Author|Price|
+-------+---------+------+-----+
|      3|   Python|Harsha|  250|
|      4|        R|Rohith| 1200|
|      5|   Hadoop|Manasa|  700|
+-------+---------+------+-----+
We get the dataframe rows satisfying the condition defined in the expression, that is, books with “Book_Id” > 2.
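Note that you can also combine multiple relational conditions inside where() using the & (and) and | (or) operators. Each individual condition must be wrapped in parentheses due to operator precedence. Here’s a minimal sketch on the same dataframe:

# filter for books with Book_Id > 2 and Price below 1000
dataframe.where((dataframe.Book_Id > 2) & (dataframe.Price < 1000)).show()

On our sample data, this keeps the “Python” and “Hadoop” rows.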
Let’s look at another example.
Let’s filter for the book with the exact title “R”. For this, we will be using the equality operator.
# filter book data for Book_Name "R"
dataframe.where(dataframe.Book_Name == 'R').show()
Output:
+-------+---------+------+-----+
|Book_Id|Book_Name|Author|Price|
+-------+---------+------+-----+
|      4|        R|Rohith| 1200|
+-------+---------+------+-----+
We only get data about the book with the title “R”.
Filter dataframe on list of values
We can use the where() function in combination with the isin() function to filter a dataframe based on a list of values. For example, let’s get the book data on books written by a specified list of writers, for example, ['Manasa', 'Rohith'].
# filter data based on list values
ls = ['Manasa', 'Rohith']
dataframe.where(dataframe.Author.isin(ls)).show()
Output:
+-------+---------+------+-----+
|Book_Id|Book_Name|Author|Price|
+-------+---------+------+-----+
|      4|        R|Rohith| 1200|
|      5|   Hadoop|Manasa|  700|
+-------+---------+------+-----+
You can see that we get data filtered by values in the list of authors used.
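Conversely, to keep only the rows whose values are not in the list, you can negate the isin() condition with the ~ operator. A minimal sketch:

# filter data for authors NOT in the list
dataframe.where(~dataframe.Author.isin(ls)).show()

This would return the remaining three rows, for the authors “Sravan”, “Chandra”, and “Harsha”.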
Filter dataframe with string functions
You can also use string functions (on columns with string data) to filter a Pyspark dataframe. For example, you can use the string startswith() function to filter for records in a column starting with some specific string.
Let’s look at some examples.
# filter data for author name starting with R
dataframe.where(dataframe.Author.startswith("R")).show()

# filter data for author name ending with h
dataframe.where(dataframe.Author.endswith("h")).show()
Output:
+-------+---------+------+-----+
|Book_Id|Book_Name|Author|Price|
+-------+---------+------+-----+
|      4|        R|Rohith| 1200|
+-------+---------+------+-----+

+-------+---------+------+-----+
|Book_Id|Book_Name|Author|Price|
+-------+---------+------+-----+
|      4|        R|Rohith| 1200|
+-------+---------+------+-----+
Here, we first filter the dataframe for author names starting with “R” and then, in the second call, for author names ending with “h”.
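Other Column string functions can be used inside where() in the same way. For example, contains() filters for values containing a given substring, and like() accepts SQL-style wildcard patterns. A short sketch on the sample dataframe:

# filter for author names containing the substring "an"
dataframe.where(dataframe.Author.contains("an")).show()
# filter for author names matching a SQL LIKE pattern
dataframe.where(dataframe.Author.like("%an%")).show()

On our data, both calls match the authors “Sravan”, “Chandra”, and “Manasa”.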