 

PySpark program throwing "name 'spark' is not defined"

The program below throws the error "name 'spark' is not defined":

Traceback (most recent call last):  
File "pgm_latest.py", line 232, in <module>
    sconf =SparkConf().set(spark.dynamicAllocation.enabled,true)       
        .set(spark.dynamicAllocation.maxExecutors,300)        
        .set(spark.shuffle.service.enabled,true)       
        .set(spark.shuffle.spill.compress,true)
NameError: name 'spark' is not defined 

The job was submitted with:
spark-submit --driver-memory 12g --master yarn-cluster --executor-memory 6g --executor-cores 3 pgm_latest.py
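Note what the traceback actually shows: the configuration keys are written as bare names (`spark.dynamicAllocation.enabled`) instead of strings, so Python tries to resolve `spark` as a variable before `.set()` is even called. A minimal illustration of that failure mode in plain Python, with no Spark needed (`set_option` is a made-up stand-in for `SparkConf.set`, not a Spark API):

```python
# Toy stand-in for SparkConf.set (hypothetical helper, not a Spark API).
def set_option(conf, key, value):
    conf[key] = value
    return conf

conf = {}

# Quoted key: the key is just a string, so this works.
set_option(conf, "spark.dynamicAllocation.enabled", "true")

# Bare key: Python treats `spark` as a variable name and fails,
# exactly as in the traceback above.
try:
    set_option(conf, spark.dynamicAllocation.enabled, "true")
except NameError as err:
    print(err)  # name 'spark' is not defined
```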

Code

#!/usr/bin/python
import sys
import os
from datetime import *
from time import *
from pyspark.sql import *
from pyspark import SparkContext
from pyspark import SparkConf

sc = SparkContext()
sqlCtx = HiveContext(sc)

sqlCtx.sql('SET spark.sql.autoBroadcastJoinThreshold=104857600')
sqlCtx.sql('SET Tungsten=true')
sqlCtx.sql('SET spark.sql.shuffle.partitions=500')
sqlCtx.sql('SET spark.sql.inMemoryColumnarStorage.compressed=true')
sqlCtx.sql('SET spark.sql.inMemoryColumnarStorage.batchSize=12000')
sqlCtx.sql('SET spark.sql.parquet.cacheMetadata=true')
sqlCtx.sql('SET spark.sql.parquet.filterPushdown=true')
sqlCtx.sql('SET spark.sql.hive.convertMetastoreParquet=true')
sqlCtx.sql('SET spark.sql.parquet.binaryAsString=true')
sqlCtx.sql('SET spark.sql.parquet.compression.codec=snappy')
sqlCtx.sql('SET spark.sql.hive.convertMetastoreParquet=true')

## Main functionality
def main(sc):
    # (body omitted in the question)
    pass

if __name__ == '__main__':

    # Configure OPTIONS
    sconf = SparkConf() \
        .set("spark.dynamicAllocation.enabled", "true") \
        .set("spark.dynamicAllocation.maxExecutors", 300) \
        .set("spark.shuffle.service.enabled", "true") \
        .set("spark.shuffle.spill.compress", "true")

    sc = SparkContext(conf=sconf)

    # Execute Main functionality
    main(sc)
    sc.stop()
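For reference, the entry-point guard the script relies on can be sketched in isolation like this (a minimal sketch; `main` here is a trivial stand-in for the real driver logic):

```python
def main():
    # Trivial stand-in for the real driver logic.
    return "running main"

# Runs only when the file is executed directly
# (spark-submit pgm_latest.py), not when it is imported as a module.
if __name__ == '__main__':
    print(main())
```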
asked Mar 02 '26 by Satish Kumar Reddy

1 Answer

I think you are using a Spark version older than 2.x. In versions before 2.0 there is no predefined spark (SparkSession) object, so referring to it raises a NameError.

Instead of this:

spark.createDataFrame(...)

use this:

df = sqlContext.createDataFrame(...)
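The split described here (DataFrames come from sqlContext before 2.0, and from the spark session from 2.0 onward) can be sketched with a toy dispatch helper; `entry_point_for` is made up for illustration and is not a PySpark API:

```python
def entry_point_for(version):
    """Return the name of the object that exposes createDataFrame
    for a given Spark version string (toy helper, not a Spark API)."""
    major = int(version.split(".")[0])
    return "spark" if major >= 2 else "sqlContext"

print(entry_point_for("1.6.0"))  # sqlContext
print(entry_point_for("2.4.8"))  # spark
```

On 1.x clusters, sqlContext (or HiveContext, as in the question) has to be built from the SparkContext by hand; the spark session object only exists out of the box from 2.0 onward.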
answered Mar 04 '26 by Beyhan Gul