I want to pull a table from Teradata as a Python data-frame. I know how to accomplish this step. Next I want to run algorithms on the data to transform it however I want. Once I am done with manipulating the data in Python, I want the resulting data-frame to be saved as a new table in Teradata so that I can perform joins with other tables in the database. My question is how do I save a python data-frame back into a database, I would like to do this inside of python using a script.
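A minimal sketch of the round trip using pandas' own DataFrame.to_sql(). The example below uses an in-memory SQLite database as a stand-in so it runs anywhere; against Teradata you would instead build a SQLAlchemy engine with the teradatasqlalchemy dialect. The table and column names here are made up:

```python
import sqlite3
import pandas as pd

# Stand-in connection. For Teradata you would instead do something like:
#   from sqlalchemy import create_engine
#   engine = create_engine("teradatasql://user:password@host")  # needs teradatasqlalchemy
conn = sqlite3.connect(":memory:")

# Pretend this dataframe was pulled from the database and transformed in Python.
df = pd.DataFrame({"accounts": ["Acme", "Initech"], "Jan": [150, 200]})

# Write the result back as a new table; if_exists controls overwrite behavior.
df.to_sql("sales_transformed", conn, index=False, if_exists="replace")

# The new table is now queryable/joinable in SQL.
out = pd.read_sql("SELECT * FROM sales_transformed", conn)
print(len(out))
```

The same df.to_sql() call works unchanged once conn is replaced by a Teradata-backed SQLAlchemy engine.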
Use the to_sql() DataFrame method to create a table in Vantage based on a teradataml DataFrame. The method takes a table name as an argument and generates the DDL and DML queries that create the table in Vantage. You can also specify the name of the schema in the database to write the table to.
Now that the environment is set up and a test dataframe is created, we can use the dataframe.write method to copy the dataframe into a Teradata table. For example, the following piece of code establishes a JDBC connection with the Teradata database and moves the dataframe content into the table.
This example uses the to_sql() function to create a new table "sales_df1" based on the existing table "sales" in Teradata Vantage. The new table "sales_df1" includes only the columns "accounts", "Jan", and "Feb". Create a teradataml DataFrame "df" from the existing table "sales", select the three columns, and write the result back as a new table (the select() and to_sql() calls follow the teradataml API):
df = DataFrame("sales")
df = df.select(["accounts", "Jan", "Feb"])
df.to_sql("sales_df1")
>>> spark = SparkSession.builder.appName("SparkToTeradata").enableHiveSupport().getOrCreate()
You can use Spark SQL to read a Hive table and create the test dataframe that we are going to load into the Teradata table.
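The JDBC write itself can be sketched as below. The host, database, credentials, and table names are placeholders, and the pyspark calls are wrapped in a guard so the snippet degrades gracefully where Spark is not installed:

```python
# Connection details for the JDBC write; host, database, and credentials
# here are placeholders you would replace with your own.
jdbc_url = "jdbc:teradata://tdhost/DATABASE=mydb"
jdbc_props = {
    "user": "dbuser",
    "password": "dbpassword",
    "driver": "com.teradata.jdbc.TeraDriver",  # Teradata JDBC driver class
}

try:
    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("SparkToTeradata")
             .enableHiveSupport()
             .getOrCreate())

    # Read a Hive table into a test dataframe (hypothetical table name).
    df = spark.sql("SELECT * FROM hive_db.test_table")

    # Append the dataframe's rows into a Teradata table over JDBC.
    df.write.jdbc(url=jdbc_url, table="mydb.test_table",
                  mode="append", properties=jdbc_props)
except ImportError:
    pass  # pyspark not installed; the calls above show the intended shape
```

Note that the Teradata JDBC driver jar must be on the Spark classpath (e.g. via --jars) for the write to succeed.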
One option is to use fastteradata, specifically its load_table function:
load_table(abs_path, df, table_name, env, db, connector = "teradata", clear_table=True)
Loads a pandas dataframe from memory into Teradata via the optimized FastLoad functionality.
Note that you need to install the requirements listed here.
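For completeness, a hypothetical call following the signature quoted above; the import path and every argument value are assumptions, and the call is guarded so the sketch does not fail where the package is absent:

```python
try:
    import pandas as pd
    from fastteradata import load_table  # hypothetical import path

    # Toy dataframe standing in for the transformed result.
    df = pd.DataFrame({"accounts": ["Acme"], "Jan": [150]})

    # Arguments per the signature quoted above; all values are placeholders.
    load_table("/tmp/staging", df, "sales_transformed",
               env="dev", db="mydb", connector="teradata", clear_table=True)
    attempted = True
except ImportError:
    attempted = False  # package (or pandas) not installed; the call shape is what matters
```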
Although I have never done this myself, in theory each of the following looks promising: