This is because there is no default pyspark.sql.session.SparkSession in a plain Python program, so we just need to import the relevant modules and create a SparkSession ourselves.
Related code:
from pyspark.context import SparkContext
from pyspark.sql.session import SparkSession

# Create a local SparkContext, then wrap it in a SparkSession
sc = SparkContext('local')
spark = SparkSession(sc)
print(type(spark))
Printing its type confirms the problem is solved:
<class 'pyspark.sql.session.SparkSession'>
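For reference, the more common way to obtain a SparkSession in modern PySpark is the builder API, which creates (or reuses) the underlying SparkContext for you. The snippet below is a minimal sketch of that approach; the app name 'example' is just a placeholder:

from pyspark.sql import SparkSession

# getOrCreate() returns an existing session or builds a new one;
# 'example' is a placeholder application name
spark = SparkSession.builder \
    .master('local') \
    .appName('example') \
    .getOrCreate()
print(type(spark))  # <class 'pyspark.sql.session.SparkSession'>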