This problem occurred when creating a SparkSession with pyspark. The console error output is below:
Exception                                 Traceback (most recent call last)
<ipython-input-...> in <module>()
----> 1 spark = SparkSession.builder.appName("EstimatorTransformerParamExample").getOrCreate()

/home/zhugongzaici/DS/spark-2.0.0-bin-hadoop2.7/python/pyspark/sql/session.py in getOrCreate(self)
    164                 session = SparkSession._instantiatedContext
    165                 if session is None:
--> 166                     sparkConf = SparkConf()
    167                     for key, value in self._options.items():
    168                         sparkConf.set(key, value)

/home/zhugongzaici/DS/spark-2.0.0-bin-hadoop2.7/python/pyspark/conf.py in __init__(self, loadDefaults, _jvm, _jconf)
    102         else:
    103             from pyspark.context import SparkContext
--> 104             SparkContext._ensure_initialized()
    105             _jvm = _jvm or SparkContext._jvm
    106         self._jconf = _jvm.SparkConf(loadDefaults)

/home/zhugongzaici/DS/spark-2.0.0-bin-hadoop2.7/python/pyspark/context.py in _ensure_initialized(cls, instance, gateway)
    241         with SparkContext._lock:
    242             if not SparkContext._gateway:
--> 243                 SparkContext._gateway = gateway or launch_gateway()
    244                 SparkContext._jvm = SparkContext._gateway.jvm
    245

/home/zhugongzaici/DS/spark-2.0.0-bin-hadoop2.7/python/pyspark/java_gateway.py in launch_gateway()
     92         callback_socket.close()
     93     if gateway_port is None:
---> 94         raise Exception("Java gateway process exited before sending the driver its port number")
     95
     96     # In Windows, ensure the Java child processes do not linger after Python has exited.

Exception: Java gateway process exited before sending the driver its port number
The development environment is Jupyter Notebook (Anaconda 3) on Ubuntu, and the language is Python 3.
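For reference, here is a minimal sketch of the setup being attempted. The SPARK_HOME path is taken from the traceback above; the JAVA_HOME path is a hypothetical example, since this gateway error is commonly raised when the notebook kernel cannot see a valid Java installation:

import os

# Spark installation path, as shown in the traceback above.
os.environ["SPARK_HOME"] = "/home/zhugongzaici/DS/spark-2.0.0-bin-hadoop2.7"

# Assumption: JAVA_HOME must point at a real JDK visible to the Jupyter kernel;
# this path is only an illustrative example, adjust for the actual machine.
os.environ["JAVA_HOME"] = "/usr/lib/jvm/java-8-openjdk-amd64"

from pyspark.sql import SparkSession

# The call that raises the exception in the traceback above.
spark = SparkSession.builder.appName("EstimatorTransformerParamExample").getOrCreate()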