pyspark: NameError: name 'spark' is not defined

This error occurs because, unlike the interactive pyspark shell, a standalone Python program does not create a default SparkSession object named spark. We simply need to import the relevant classes from pyspark.sql.session and create the SparkSession ourselves.
Related code:

from pyspark.context import SparkContext
from pyspark.sql.session import SparkSession

# Create a local SparkContext, then wrap it in a SparkSession
sc = SparkContext('local')
spark = SparkSession(sc)
print(type(spark))

Printing its type shows that the SparkSession was created successfully:
<class 'pyspark.sql.session.SparkSession'>
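
Alternatively, on PySpark 2.0 and later the SparkSession builder API can create the session directly without constructing a SparkContext first. The snippet below is a minimal sketch of that approach; the 'local' master and the app name are example values, not part of the original post:

from pyspark.sql import SparkSession

# Build (or reuse) a SparkSession; 'local' and 'example' are placeholder values
spark = SparkSession.builder \
    .master('local') \
    .appName('example') \
    .getOrCreate()

print(type(spark))  # <class 'pyspark.sql.session.SparkSession'>

getOrCreate() returns an existing session if one is already active, so it is safe to call in scripts that may run inside an environment where spark is already defined.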
