Pyspark: Error – Java gateway process exited before sending the driver its port number
I get the error Exception: Java gateway process exited before sending the driver its port number when I try to instantiate a Spark session in PySpark. Here is the code:
from pyspark import SparkConf
from pyspark.sql import SparkSession

if __name__ == '__main__':
    SPARK_CONFIGURATION = SparkConf().setAppName("OPL").setMaster("local[*]")
    # The chained builder calls need explicit line continuations
    # (or enclosing parentheses) to be valid Python.
    SPARK_SESSION = SparkSession.builder \
        .config(conf=SPARK_CONFIGURATION) \
        .getOrCreate()
    print("Hello world")
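This error generally means the JVM behind PySpark's Py4J gateway never started at all. Before digging into Spark itself, it can help to sanity-check the Java setup from Python. The helper below is a hypothetical sketch (not part of PySpark), using only the standard library, that flags the usual suspects:

```python
import os
import shutil

def java_preflight():
    """Collect likely causes of 'Java gateway process exited before
    sending the driver its port number' before starting Spark."""
    problems = []
    java_home = os.environ.get("JAVA_HOME")
    if java_home is None:
        problems.append("JAVA_HOME is not set")
    elif " " in java_home:
        # Paths with spaces are known to break Spark's Windows launch scripts.
        problems.append("JAVA_HOME contains spaces: " + java_home)
    if java_home is None and shutil.which("java") is None:
        problems.append("no 'java' executable found on PATH")
    return problems

if __name__ == '__main__':
    for problem in java_preflight():
        print("WARNING:", problem)
```

Running a check like this before building the SparkSession gives a clearer message than the generic gateway exception.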
Here is the traceback:

Neon was unexpected at this time.
Traceback (most recent call last):
  File "C:\Users\IBM_ADMIN\Documents\Eclipse Neon for Liberty on Bluemix\OPL_Interface\src\TestSpark\Test.py", line 12, in <module>
    .config(conf=SPARK_CONFIGURATION)
  File "C:\Users\IBM_ADMIN\Documents\spark-2.1.0-bin-hadoop2.7\python\pyspark\sql\session.py", line 169, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "C:\Users\IBM_ADMIN\Documents\spark-2.1.0-bin-hadoop2.7\python\pyspark\context.py", line 307, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "C:\Users\IBM_ADMIN\Documents\spark-2.1.0-bin-hadoop2.7\python\pyspark\context.py", line 115, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
  File "C:\Users\IBM_ADMIN\Documents\spark-2.1.0-bin-hadoop2.7\python\pyspark\context.py", line 256, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "C:\Users\IBM_ADMIN\Documents\spark-2.1.0-bin-hadoop2.7\python\pyspark\java_gateway.py", line 95, in launch_gateway
    raise Exception("Java gateway process exited before sending the driver its port number")
Exception: Java gateway process exited before sending the driver its port number
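The stray Neon was unexpected at this time. line above is a classic cmd.exe symptom of an unquoted path containing spaces (here, the Eclipse Neon for Liberty on Bluemix workspace folder) tripping up Spark's Windows .cmd launch scripts. A quick, hypothetical check for space-containing paths, assuming the conventional environment variable names:

```python
import os

# Environment variables whose values Spark's Windows .cmd scripts interpolate;
# the variable names below are the conventional ones, assumed for illustration.
SUSPECT_VARS = ("SPARK_HOME", "JAVA_HOME", "HADOOP_HOME", "PYTHONPATH")

def paths_with_spaces(environ=None):
    """Return the names of suspect variables whose values contain spaces."""
    if environ is None:
        environ = os.environ
    return [var for var in SUSPECT_VARS
            if environ.get(var) and " " in environ[var]]
```

Moving Spark (and the project itself) to a space-free path such as C:\spark often makes this particular symptom disappear.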
I am using PyDev in Eclipse Neon.2 Release (4.6.2). Here is the configuration. Note: I am using the latest Spark release, spark-2.1.0-bin-hadoop2.7. I have already checked several related questions:

Pyspark: Exception: Java gateway process exited before sending the driver its port number
Spark + Python – Java gateway process exited before sending the driver its port number?

Is this a bug in Pyspark?

Solution
A colleague and I ran into the same problem, and it blocked us and had us pulling our hair out for a while. We tried a bunch of the usual suggestions (no spaces in the Java path, setting/unsetting the PYSPARK_SUBMIT_ARGS environment variable, ...) to no avail.
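For reference, the PYSPARK_SUBMIT_ARGS experiment mentioned above typically looks like the following. The exact value is an assumption based on commonly circulated advice, and it must be set before pyspark is imported and the gateway is launched:

```python
import os

# Hypothetical value based on common advice; 'pyspark-shell' must come last
# so spark-submit knows which application to start.
os.environ["PYSPARK_SUBMIT_ARGS"] = "--master local[*] pyspark-shell"
```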
What fixed it for us was switching to Spark 2.3.1; we had been trying 2.2.1 and 2.3.0. Hope this saves someone a bit of hair-pulling.
