PySpark connects to PostgreSQL and reports an error: java.lang.ClassNotFoundException: org.postgresql.Driver

How to connect PySpark to PostgreSQL

Method 1: placing the PostgreSQL JAR in the spark2/jars path does not work. (error)

Method 2: calling spark.conf.set('spark.jars', '/usr/hdp/3.0.1.0-187/spark2/jars/postgresql-42.2.5.jar') does not work either. The spark.jars property must be set before the driver JVM starts, so changing it on an already running SparkSession has no effect. (error)

Solution

When starting pyspark, pass the JAR via the --jars parameter:

pyspark --jars /usr/hdp/3.0.1.0-187/spark2/jars/postgresql-42.2.5.jar

This works.
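Once the shell is launched with --jars as above, the driver class resolves and a table can be read over JDBC. Below is a minimal sketch of such a read; the host, database name, table name, and credentials are placeholder assumptions, not values from this article.

```python
# Sketch: reading a PostgreSQL table from PySpark over JDBC,
# assuming pyspark was started with --jars pointing at the PostgreSQL driver.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("pg-read-example").getOrCreate()

df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://localhost:5432/mydb")  # hypothetical host/db
    .option("dbtable", "public.my_table")                    # hypothetical table
    .option("user", "postgres")                              # hypothetical credentials
    .option("password", "secret")
    .option("driver", "org.postgresql.Driver")               # the class from the error
    .load()
)

df.show(5)
```

Without the --jars flag, the `.load()` call is the point where java.lang.ClassNotFoundException: org.postgresql.Driver is raised, since the driver class is missing from the JVM classpath.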
