Python-to-Spark demos

Contents

1. Test
2. UDFUtil
3. MathUtil


1. Test



import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.junit.Test
import util.{MathUtil, UDFUtil}

/**
 * @author  
 * @version
 * @date 2020/5/5
 * @Description
 */
class PyToScala {

  val spark: SparkSession = SparkSession
    .builder()
    .appName("local-test")
    .master("local[4]")
    //.enableHiveSupport()
    .config("spark.shuffle.service.enabled", true)
    .config("spark.driver.maxResultSize", "4G")
    .config("spark.sql.parquet.writeLegacyFormat", true)
    .getOrCreate()
}

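The builder above configures the local SparkSession that the demos share. The `UDFUtil` and `MathUtil` helpers it imports are shown in later sections; as a rough illustration of the pattern, here is a minimal sketch of a pure `MathUtil`-style helper that could later be wrapped in a Spark UDF. The object name, function name, and rounding behavior are all assumptions, not the actual util package:

```scala
// Hypothetical sketch of a MathUtil-style helper (names and behavior assumed):
// a pure function, usable standalone or registered as a Spark UDF.
object MathUtilSketch {
  // Round a Double to the given number of decimal places, half-up.
  def roundAt(value: Double, places: Int): Double =
    BigDecimal(value).setScale(places, BigDecimal.RoundingMode.HALF_UP).toDouble
}

// Registering it against the SparkSession above would look like:
//   spark.udf.register("round_at", MathUtilSketch.roundAt _)
println(MathUtilSketch.roundAt(3.14159, 2))  // prints 3.14
```

Keeping the logic in a plain object like this makes it unit-testable without starting a SparkSession, which is the usual motivation for a separate util package.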
Copyright notice: this is an original article by onway_goahead, licensed under CC 4.0 BY-SA; please include a link to the original source and this notice when reposting.