
Spark flatMap

By Priya Pedamkar


Introduction to Spark flatMap

In Apache Spark, flatMap is one of the transformation operations. Like map, the transformation function is applied to every element of an RDD (Resilient Distributed Dataset). RDDs are immutable, partitioned collections of records and can only be created through operations that are applied across all elements of the dataset, such as filter and map. Inside the transformation function, the developer is free to implement custom business logic. map() and flatMap() are closely related, but where map() returns exactly one element per input element, flatMap() can return zero, one or more elements from its function.

Syntax for flatMap in Spark:

RDD.flatMap(<transformation function>)

In the above syntax, the transformation function can return multiple output elements for each source element of the RDD.
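
For instance, a minimal sketch in the Scala spark-shell (where a SparkContext named sc is already available) shows one source element expanding into several output elements; the numeric input and the Seq.fill expansion are illustrative assumptions, not part of the article's example:

// Each source element can expand into several output elements
val numbers = sc.parallelize(Seq(1, 2, 3))

// For every n, emit n copies of n: 1 -> (1), 2 -> (2, 2), 3 -> (3, 3, 3)
val expanded = numbers.flatMap(n => Seq.fill(n)(n))

expanded.collect()   // Array(1, 2, 2, 3, 3, 3)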

How Does Spark flatMap Work?


A flatMap is a transformation operation. Applying it to each element of an RDD returns a new RDD, and each input element can produce zero, one, two or many output elements. The map operation is closely related to flatMap and works one step behind it: map always produces exactly one output element per input element, whereas flatMap flattens whatever collection the function returns.
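
To make the difference concrete, here is a hedged sketch in the Scala spark-shell (again assuming sc is available); the two sample lines are chosen only for illustration:

val lines = sc.parallelize(Seq("Hello world", "How are you doing"))

// map keeps exactly one output element per input element: two arrays of words
lines.map(line => line.split(" ")).collect()
// Array(Array(Hello, world), Array(How, are, you, doing))

// flatMap flattens those arrays, so every word becomes its own record
lines.flatMap(line => line.split(" ")).collect()
// Array(Hello, world, How, are, you, doing)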

Example:

Spark
Scala
Java helps
Hello world
How are you doing
Debugging is fun

Code:

flatMap(a => a.split(' '))

Output:

Spark
Scala
Java
helps
Hello
world
How
are
you
doing
Debugging
is
fun

The flatMap transformation maps one input element to many output elements.

Consider the call lines.flatMap(a => a.split(' ')). It is a flatMap that creates a new RDD from the 6 records above by splitting each record into separate words at the spaces between them.

A key/value RDD can also be split using the flatMap transformation.

For the above example, if we pair the lines with key values, each word keeps the same numeric key as the line it came from, so the key identifies each key/value pair (see the sketch after the output below).


1. Spark
2. Scala
3. Java helps
4. Hello world
5. How are you doing
6. Debugging is fun

Code:

flatMap(a => a.split(' '))

Output:

(The output picture showed each word as a separate record, keyed by the number of its source line.)
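
As a hedged sketch of this key/value variant (assuming a spark-shell where sc is available; the keyed tuples are built by hand here for illustration), flatMap can re-attach each word to the key of the line it came from:

val keyedLines = sc.parallelize(Seq(
  (1, "Spark"), (2, "Scala"), (3, "Java helps"),
  (4, "Hello world"), (5, "How are you doing"), (6, "Debugging is fun")
))

// Split the value and keep the key, so "Java helps" yields (3,Java) and (3,helps)
val keyedWords = keyedLines.flatMap { case (key, line) =>
  line.split(" ").map(word => (key, word))
}

keyedWords.collect().foreach(println)
// (1,Spark), (2,Scala), (3,Java), (3,helps), (4,Hello), (4,world), ...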

One-to-zero mapping can also be expressed with flatMap. For example, lines.flatMap(a => None) returns an empty RDD, because flatMap does not create a record in the resulting RDD for a None value.

Spark
Scala
Java helps
Hello world
How are you doing
Debugging is fun

Code:

flatMap(a => None)

Output:

(empty RDD, no records are produced)
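
The following sketch (again in the spark-shell with sc available; the Try-based parsing at the end is an added illustration, not from the article) shows both the empty result of returning None and the more practical use of Option to drop some records while keeping others:

val lines = sc.parallelize(Seq("Spark", "Scala", "Java helps"))

// Returning None for every element produces an empty RDD
lines.flatMap(a => None).count()   // 0

// Option lets flatMap keep valid records and silently drop invalid ones
val numericOnly = sc.parallelize(Seq("1", "two", "3"))
  .flatMap(s => scala.util.Try(s.toInt).toOption)
numericOnly.collect()   // Array(1, 3)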

Examples of Spark flatMap

Given below are the examples mentioned:

Example #1

String to words – An example for Spark flatMap in RDD using Java.

Code:

import java.util.Arrays;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class flatMapEx {
    public static void main(String[] args) {
        // Spark configuration is set up as below
        SparkConf sparkConf = new SparkConf().setAppName("Text Reading")
                .setMaster("local[2]").set("spark.executor.memory", "2g");
        // A Spark context is started
        JavaSparkContext sc = new JavaSparkContext(sparkConf);
        // The path of the input text file is provided
        String path = "data/stringToWords/input_rdd/sample1.txt";
        // The text file is read into an RDD
        JavaRDD<String> lines = sc.textFile(path);
        // Each line is split into words using flatMap
        JavaRDD<String> words = lines.flatMap(s -> Arrays.asList(s.split(" ")).iterator());
        // The RDD is collected and printed
        for (String word : words.collect()) {
            System.out.println(word);
        }
        sc.close();
    }
}
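
To launch this Java example one might package it into a jar and use spark-submit, similar to the Python example further below; the jar name here is a hypothetical placeholder for whatever your build produces:

~$ spark-submit --class flatMapEx --master local[2] flatmap-example.jar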

sample1 – sample1.txt:

Welcome to TutorialKart

Learn Apache Spark

Learn to work with RDD

Output:

Welcome
to
TutorialKart
Learn
Apache
Spark
Learn
to
work
with
RDD

Example #2

String to words – An example for Spark flatMap in RDD using Python (PySpark).

Code:

from pyspark import SparkContext, SparkConf

if __name__ == "__main__":
    # Create a Spark context using a Spark configuration
    conf = SparkConf().setAppName("Read Text to RDD - Python")
    sc = SparkContext(conf=conf)
    # The input text file is read into the RDD
    lines = sc.textFile("/home/tutorialeducba/heythere/spark_rdd/sample1.txt")
    # Each line is converted to words using flatMap
    words = lines.flatMap(lambda line: line.split(" "))
    # A list is made from the collected RDD
    list1 = words.collect()
    # Print the above list1
    for line in list1:
        print(line)

The spark-submit command to run the above Python code is given below.

Code:

~$ spark-submit flatmap-spark-rdd-exp.py

sample1 – sample1.txt:

Welcome to TutorialKart

Learn Apache Spark

Learn to work with RDD

Output:

Welcome
to
TutorialKart
Learn
Apache
Spark
Learn
to
work
with
RDD

Important points to note about the flatMap transformation in Spark:

  • The Spark flatMap transformation produces flattened output.
  • Like every Spark transformation, flatMap is evaluated lazily (see the sketch after this list).
  • The transformation function passed to flatMap may return a list, a sequence or an array.
  • Data is not shuffled from one partition to another, because flatMap is a narrow transformation.
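
As a small sketch of the lazy-evaluation point (assuming a spark-shell with sc available; the two sample lines are illustrative), the flatMap call itself does nothing until an action is invoked:

val lines = sc.parallelize(Seq("Hello world", "Debugging is fun"))

// Nothing runs yet: flatMap only records the transformation in the RDD lineage
val words = lines.flatMap(line => line.split(" "))

// The work is executed only when an action such as count() or collect() is called
words.count()   // 5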

Conclusion

We have seen the concept of the Spark flatMap operation. The flatMap transformation expresses a one-to-many transformation: each input element is mapped to zero, one, two or more output elements. Inside the flatMap function, a developer can implement custom business logic, and the same logic is applied to every element of the RDD.

Recommended Articles

This is a guide to Spark flatMap. Here we discuss how Spark flatMap works, along with examples. You may also have a look at the following articles to learn more –

  1. Spark Versions
  2. Spark Broadcast
  3. Spark SQL Dataframe
  4. Spark Streaming