 

convert string to BigInt dataframe spark scala

I am trying to insert values from a DataFrame whose fields are string type into a PostgreSQL table whose fields are bigint.

I didn't find how to cast them to big int. I used IntegerType before with no problem, but with this DataFrame the cast gives me negative integers:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types.IntegerType

val sparkSession = SparkSession.builder.master("local").appName("spark session example").getOrCreate()
import sparkSession.sqlContext.implicits._

val cabArticleGold = sparkSession.sqlContext
  .load("jdbc", Map(
    "url" -> "jdbc:oracle:thin:System/maher@//localhost:1521/XE",
    "dbtable" -> "IPTECH.TMP_ARTCAB"))
  .select("CODEART", "CAB")
  .limit(10)

cabArticleGold.show()
cabArticleGold.withColumn("CAB", 'CAB.cast(IntegerType)).foreach(row => println(row(1)))

232524399
-1613725482
232524423
-1613725465
232524437
-1191331072
3486
-1639094853
232524461
1564177573

Any help using BigInt would be appreciated. I know that Scala supports BigInt, but how can I do it here?
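The negative values are consistent with 32-bit overflow: values above `Int.MaxValue` wrap around when cast to `IntegerType`. As a minimal pure-Scala illustration (the sample value here is hypothetical, chosen so the wraparound matches one of the negatives printed above):

```scala
// Int is 32-bit; converting a Long above Int.MaxValue (2147483647)
// keeps only the low 32 bits, which reads as a negative number.
val big = 2681241814L
println(big.toInt)   // -1613725482, two's-complement wraparound
println(big)         // 2681241814, intact as a Long
```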

Maher HTB asked Oct 11 '25 23:10
1 Answer

For large integers you should use LongType:

cabArticleGold.withColumn("CAB", 'CAB.cast(LongType))

or

cabArticleGold.withColumn("CAB", 'CAB.cast("long"))
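A standalone sketch of the LongType cast, using a local session and hypothetical sample values (one of them above `Int.MaxValue`) in place of the asker's Oracle table:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types.LongType

val spark = SparkSession.builder.master("local").appName("cast-demo").getOrCreate()
import spark.implicits._

// String column with a value too large for a 32-bit Int.
val df = Seq("232524399", "2681241814").toDF("CAB")

// Cast to LongType: values survive without wraparound.
val casted = df.withColumn("CAB", 'CAB.cast(LongType))

casted.printSchema()   // CAB is now long
casted.show()
```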

You can also use DecimalType:

cabArticleGold.withColumn("CAB", 'CAB.cast(DecimalType(38, 0)))

or

cabArticleGold.withColumn("CAB", 'CAB.cast("decimal(38, 0)"))
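DecimalType(38, 0) covers integers too large even for a 64-bit Long, which is the closest Spark gets to an arbitrary-precision BigInt column. A sketch with a hypothetical value above `Long.MaxValue`:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types.DecimalType

val spark = SparkSession.builder.master("local").appName("decimal-demo").getOrCreate()
import spark.implicits._

// 20-digit value, larger than Long.MaxValue (9223372036854775807).
val df = Seq("92233720368547758080").toDF("CAB")

// decimal(38, 0) stores up to 38 integer digits with no fractional part.
val casted = df.withColumn("CAB", 'CAB.cast(DecimalType(38, 0)))

casted.printSchema()
casted.show(false)
```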
Alper t. Turker answered Oct 15 '25 16:10