I want to export a complete data frame into a table that already exists in a PostgreSQL database and contains similar data. I found a few questions explaining dbWriteTable(..., overwrite = TRUE), but I don't want to overwrite the data that is already present in my table. I just want to update the table with the data frame from the R console.
Can someone let me know how I can do this? Something like this:
dbInsertTable(con, df, tablename = "MyTable")
You'll need dbWriteTable. Assuming you don't use row names in your data frame, you'd do
dbWriteTable(con, "MyTable", df, row.names=FALSE, append=TRUE)
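For context, here's a minimal end-to-end sketch using the DBI and RPostgres packages (RPostgreSQL works the same way); the connection details are placeholders you'd replace with your own:

library(DBI)

# Connect to PostgreSQL; dbname/host/user/password are placeholder values
con <- dbConnect(
  RPostgres::Postgres(),
  dbname = "mydb",
  host = "localhost",
  user = "myuser",
  password = "mypassword"
)

# append = TRUE adds the rows of df without touching what's already in the table
dbWriteTable(con, "MyTable", df, row.names = FALSE, append = TRUE)

dbDisconnect(con)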
If you want the row names from the df to be a column in your database table, then you'd set that option to TRUE. If your table is in a schema other than the public schema, then you'd pass c('myschema', 'MyTable') instead of the intuitive 'myschema.MyTable'. Also, the columns of your data frame need to be in the same order as the columns in your database table: it matches based on order, not name.
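Because the match is positional, one defensive step (a sketch, assuming your data frame's column names are the same as the table's) is to reorder the data frame to the table's column order before writing:

# Fetch the table's column order from the database and reorder df to match
target_cols <- dbListFields(con, "MyTable")
df <- df[, target_cols]

dbWriteTable(con, "MyTable", df, row.names = FALSE, append = TRUE)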
As an aside, you probably shouldn't use capital letters in your Postgres table or column names, because then you need to quote them. If you really are using capital letters, you'd need something like dbWriteTable(con, '"MyTable"', df, row.names=FALSE, append=TRUE).
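Rather than embedding the quotes by hand, a cleaner option (a sketch using DBI's identifier helper, available in recent DBI versions, which also covers the schema case above) is:

# DBI::Id() lets the driver quote the identifier correctly, schema included
dbWriteTable(con, DBI::Id(schema = "myschema", table = "MyTable"), df,
             row.names = FALSE, append = TRUE)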