Creating a DataFrame from Row results in 'infer schema issue'

When I began learning PySpark, I used a list to create a DataFrame. Now that inferring the schema from a list has been deprecated, I get a warning suggesting that I use pyspark.sql.Row instead. However, when I try to create a DataFrame using Row, I get a schema-inference error. This is my code:

>>> row = Row(name='Severin', age=33)
>>> df = spark.createDataFrame(row)

This results in the following error:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/spark2-client/python/pyspark/sql/session.py", line 526, in createDataFrame
    rdd, schema = self._createFromLocal(map(prepare, data), schema)
  File "/spark2-client/python/pyspark/sql/session.py", line 390, in _createFromLocal
    struct = self._inferSchemaFromList(data)
  File "/spark2-client/python/pyspark/sql/session.py", line 322, in _inferSchemaFromList
    schema = reduce(_merge_type, map(_infer_schema, data))
  File "/spark2-client/python/pyspark/sql/types.py", line 992, in _infer_schema
    raise TypeError("Can not infer schema for type: %s" % type(row))
TypeError: Can not infer schema for type: <type 'int'>

So I created a schema:

>>> schema = StructType([StructField('name', StringType()), 
...                      StructField('age',IntegerType())])
>>> df = spark.createDataFrame(row, schema)

But then this error gets thrown:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/spark2-client/python/pyspark/sql/session.py", line 526, in createDataFrame
    rdd, schema = self._createFromLocal(map(prepare, data), schema)
  File "/spark2-client/python/pyspark/sql/session.py", line 387, in _createFromLocal
    data = list(data)
  File "/spark2-client/python/pyspark/sql/session.py", line 509, in prepare
    verify_func(obj, schema)
  File "/spark2-client/python/pyspark/sql/types.py", line 1366, in _verify_type
    raise TypeError("StructType can not accept object %r in type %s" % (obj, type(obj)))
TypeError: StructType can not accept object 33 in type <type 'int'>
Asked Jan 03 '23 by Sivaprasanna Sethuraman

1 Answer

The createDataFrame function takes a list of Rows (among other options) plus the schema, so the correct code would be something like:

from pyspark.sql.types import *
from pyspark.sql import Row

schema = StructType([StructField('name', StringType()), StructField('age',IntegerType())])
rows = [Row(name='Severin', age=33), Row(name='John', age=48)]
df = spark.createDataFrame(rows, schema)

df.printSchema()
df.show()

Out:

root
 |-- name: string (nullable = true)
 |-- age: integer (nullable = true)

+-------+---+
|   name|age|
+-------+---+
|Severin| 33|
|   John| 48|
+-------+---+

In the pyspark docs (link) you can find more details about the createDataFrame function.
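
For comparison, here is a minimal sketch (assuming the same spark session and imports as above) of letting Spark infer the schema directly from a list of Row objects instead of declaring it explicitly; note that the column types are then inferred rather than declared:

from pyspark.sql import Row

rows = [Row(name='Severin', age=33), Row(name='John', age=48)]

# No explicit schema: Spark infers field names and types from the Row objects.
df_inferred = spark.createDataFrame(rows)
df_inferred.printSchema()

Passing an explicit StructType, as in the snippet above, keeps the column types under your control; inference is convenient for quick experiments.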

Answered May 15 '23 by Daniel de Paula