Abstract:
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.ml.classification.LogisticRegres…
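A minimal sketch in the direction these imports point: scoring a text stream with a pre-trained spark.ml LogisticRegression model. The socket source (localhost:9999), the toy training rows, and the comma-separated feature format are assumptions for illustration, not taken from the post.

import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.linalg.Vectors
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.appName("StreamingLR").master("local[2]").getOrCreate()
import spark.implicits._

// Train a toy model on two hypothetical labeled points.
val training = Seq(
  (1.0, Vectors.dense(2.0, 1.5)),
  (0.0, Vectors.dense(0.1, 0.2))
).toDF("label", "features")
val model = new LogisticRegression().setMaxIter(10).fit(training)

// Score each micro-batch arriving on an assumed socket source.
val ssc = new StreamingContext(spark.sparkContext, Seconds(5))
val lines = ssc.socketTextStream("localhost", 9999) // e.g. "2.0,1.5" per line
lines.foreachRDD { rdd =>
  if (!rdd.isEmpty()) {
    val df = rdd.map(_.split(",").map(_.toDouble))
                .map(a => Tuple1(Vectors.dense(a)))
                .toDF("features")
    model.transform(df).select("features", "prediction").show()
  }
}

ssc.start()
ssc.awaitTermination()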
posted @ 2025-01-13 20:24
为20岁努力
Abstract:
import org.apache.spark.graphx._
import org.apache.spark.rdd.RDD
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName…
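A minimal sketch of what the excerpt suggests: building a small property graph with GraphX and running PageRank over it. The vertex names, the "follows" edge labels, and the 0.001 convergence tolerance are invented sample values.

import org.apache.spark.graphx._
import org.apache.spark.rdd.RDD
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("GraphX Example").setMaster("local[*]")
val sc = new SparkContext(conf)

// Hypothetical sample graph: three users following each other in a cycle.
val vertices: RDD[(VertexId, String)] =
  sc.parallelize(Seq((1L, "Alice"), (2L, "Bob"), (3L, "Carol")))
val edges: RDD[Edge[String]] =
  sc.parallelize(Seq(Edge(1L, 2L, "follows"), Edge(2L, 3L, "follows"), Edge(3L, 1L, "follows")))
val graph = Graph(vertices, edges)

// Run PageRank until it converges to within the given tolerance.
val ranks = graph.pageRank(0.001).vertices
ranks.join(vertices).collect().foreach { case (_, (rank, name)) =>
  println(f"$name%-6s rank = $rank%.4f")
}

sc.stop()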
posted @ 2025-01-12 20:24
为20岁努力
Abstract:
import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.appName("MLlib E…
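A minimal sketch of the pattern the excerpt opens with: fitting a spark.ml LogisticRegression on a small in-memory DataFrame. The app name continuation and the training rows are assumptions; the four labeled vectors are invented for illustration.

import org.apache.spark.ml.classification.LogisticRegression
import org.apache.spark.ml.linalg.Vectors
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.appName("MLlib Example").master("local[*]").getOrCreate()
import spark.implicits._

// Invented training rows: label plus a 3-dimensional feature vector.
val training = Seq(
  (1.0, Vectors.dense(0.0, 1.1, 0.1)),
  (0.0, Vectors.dense(2.0, 1.0, -1.0)),
  (0.0, Vectors.dense(2.0, 1.3, 1.0)),
  (1.0, Vectors.dense(0.0, 1.2, -0.5))
).toDF("label", "features")

// Fit a regularized logistic regression model.
val lr = new LogisticRegression().setMaxIter(10).setRegParam(0.01)
val model = lr.fit(training)

println(s"Coefficients: ${model.coefficients}  Intercept: ${model.intercept}")
model.transform(training).select("label", "probability", "prediction").show()

spark.stop()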
posted @ 2025-01-11 20:24
为20岁努力
Abstract:
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Initialize the StreamingContext
val conf = new SparkConf().setAppName("Spark Streaming").setMaster…
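A minimal sketch completing the truncated setup as the classic socket word count. The master string "local[2]" is an assumption (a receiver needs at least two local threads), as are the host and port.

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Initialize a StreamingContext with a 1-second batch interval.
val conf = new SparkConf().setAppName("Spark Streaming").setMaster("local[2]")
val ssc = new StreamingContext(conf, Seconds(1))

// Count words arriving on the assumed socket source.
val lines = ssc.socketTextStream("localhost", 9999)
val counts = lines.flatMap(_.split(" ")).map((_, 1)).reduceByKey(_ + _)
counts.print()

ssc.start()
ssc.awaitTermination()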
posted @ 2025-01-10 20:23
为20岁努力
Abstract:
import org.apache.spark.{SparkConf, SparkContext}

// Initialize the SparkContext
val conf = new SparkConf().setAppName("Spark Basics").setMaster("local")
val sc = n…
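A minimal sketch of core RDD usage in the spirit of the excerpt: a couple of lazy transformations followed by an action. The numbers and the map/filter chain are sample data, not the post's actual example.

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("Spark Basics").setMaster("local")
val sc = new SparkContext(conf)

// Transformations are lazy; the sum() action triggers execution.
val numbers = sc.parallelize(1 to 10)
val squares = numbers.map(n => n * n)
val evenSquares = squares.filter(_ % 2 == 0)
println(s"Sum of even squares: ${evenSquares.sum()}")

sc.stop()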
posted @ 2025-01-09 20:23
为20岁努力
Abstract:
// Define a class
class Person(val name: String, val age: Int)

// Collection operations
val numbers = List(1, 2, 3, 4, 5)
val doubled = numbers.map(_ * 2)
val filtered = numbers.…
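A minimal runnable sketch pairing the excerpt's Person class with standard collection operations. The excerpt cuts off at "numbers."; the filter(_ % 2 == 0) completion, the toString override, and the people list are assumptions.

object CollectionsDemo extends App {
  // Define a class with immutable fields.
  class Person(val name: String, val age: Int) {
    override def toString: String = s"$name ($age)"
  }

  // Collection operations: map, filter, reduce.
  val numbers = List(1, 2, 3, 4, 5)
  val doubled = numbers.map(_ * 2)          // List(2, 4, 6, 8, 10)
  val filtered = numbers.filter(_ % 2 == 0) // List(2, 4)
  val total = numbers.reduce(_ + _)         // 15

  val people = List(new Person("Alice", 30), new Person("Bob", 17))
  val adults = people.filter(_.age >= 18)   // List(Alice (30))
  println(s"doubled=$doubled filtered=$filtered total=$total adults=$adults")
}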
posted @ 2025-01-08 20:22
为20岁努力
Abstract:
// Print output
println("Hello, Scala!")

// Basic arithmetic
val a = 10
val b = 5
println(s"Sum: ${a + b}")
println(s"Product: ${a * b}")

// Check whether a number is odd or even
val number = scala.io.StdIn.r…
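A minimal runnable sketch of the excerpt wrapped in an App object. The excerpt cuts off at scala.io.StdIn.r; readInt() is a plausible completion given the odd/even check, but it is an assumption, as is the prompt string.

object ScalaBasics extends App {
  // Print output
  println("Hello, Scala!")

  // Basic arithmetic
  val a = 10
  val b = 5
  println(s"Sum: ${a + b}")      // Sum: 15
  println(s"Product: ${a * b}")  // Product: 50

  // Check whether a number read from stdin is odd or even
  // (readInt() is an assumed completion of the truncated call).
  print("Enter a number: ")
  val number = scala.io.StdIn.readInt()
  if (number % 2 == 0) println(s"$number is even")
  else println(s"$number is odd")
}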
posted @ 2025-01-07 20:22
为20岁努力