Does Python have a spark library?
As you can see from the name, pyspark is a combination of Python and Spark.
This article assumes you have already installed Hadoop, Spark, and Python 3 on your computer.
Spark provides a Python shell, pyspark, so that you can write Spark programs in Python interactively.
The core entry point in pyspark is SparkContext (sc for short), and the most important data carrier is the RDD. An RDD is like a NumPy array or a Pandas Series: it can be regarded as an ordered collection of items. However, these items do not live in the memory of the driver; they are divided into many partitions, and each partition's data is stored in the memory of the cluster's executors.
Importing the pyspark modules in Python
import pyspark
from pyspark import SparkContext
from pyspark import SparkConf

# Every Spark program starts with a SparkContext. Initializing a SparkContext
# requires a SparkConf object, which holds the various parameters of the Spark
# cluster configuration (such as the master node's URL). Once initialized, the
# SparkContext object's methods can be used to create and operate on RDDs and
# shared variables. The Spark shell initializes a SparkContext automatically
# (in Scala and Python, but not in Java).
conf = SparkConf().setAppName("miniProject").setMaster("local[*]")

# getOrCreate returns an existing SparkContext if one is already running;
# otherwise it creates a new one.
sc = SparkContext.getOrCreate(conf)
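As a minimal sketch of how the sc created above might be used (the sample data, the squaring function, and the partition count are illustrative assumptions, not from the original article):

# Create an RDD from a local list, split into 2 partitions
rdd = sc.parallelize([1, 2, 3, 4, 5], numSlices=2)
squared = rdd.map(lambda x: x * x)  # transformation, evaluated lazily
print(squared.collect())            # action, returns [1, 4, 9, 16, 25]
print(rdd.getNumPartitions())       # 2

Note that collect() brings the partitioned data back to the driver; until an action like this is called, the items stay in the executors' memory, as described above.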
SparkSession is a new concept introduced in Spark 2.0.
SparkSession provides users with a unified entry point to all of Spark's functionality. In early versions of Spark, SparkContext was the main entry point: since the RDD was the main API, we created and operated on RDDs through the SparkContext. For every other API, we needed a different context. For example, Streaming required a StreamingContext, SQL a SQLContext, and Hive a HiveContext. But as the DataSet and DataFrame APIs gradually became the standard, entry points had to be established for them as well. Therefore, in Spark 2.0, SparkSession was introduced as the entry point for the DataSet and DataFrame APIs.
SparkSession is essentially a combination of SQLContext and HiveContext (StreamingContext may be added in the future), so the APIs available on SQLContext and HiveContext can also be used on SparkSession. SparkSession encapsulates SparkContext internally, so the calculation is actually completed by SparkContext.
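A minimal sketch of the SparkSession entry point described above (the app name and sample rows are illustrative assumptions):

from pyspark.sql import SparkSession

# Builder pattern: getOrCreate reuses an existing session or creates a new one
spark = (SparkSession.builder
         .appName("miniProject")
         .master("local[*]")
         .getOrCreate())

# The encapsulated SparkContext is still reachable when you need the RDD API
sc = spark.sparkContext

# DataFrame API, for which SparkSession is the entry point
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "letter"])
df.show()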