I was using ActiveRecord to load my stories and then generate a CSV the standard way, as shown in the RailsCast. But I have a lot of rows, and it takes minutes. I think if I could get PostgreSQL to do the CSV rendering, I could save some time.
Here's what I have right now:
query = "COPY stories TO STDOUT WITH CSV HEADER;"
result = ActiveRecord::Base.connection.execute(query)
But the result is empty for this query:
=> #<PG::Result:0x00000006ea0488 @connection=#<PG::Connection:0x00000006c62fb8 @socket_io=nil, @notice_receiver=nil, @notice_processor=nil>>
2.0.0-p247 :053 > result.count
=> 0
A clearer way to see that it's empty:
2.0.0-p247 :059 > result.to_json
=> "[]"
I suspect my controller will look something like this:
format.csv { send_data raw_results }
This works for normal queries, I just can't figure out the SQL syntax to have the CSV results returned to rails.
UPDATE
Got the CSV export down from 120,000 ms to 290 ms.
My model:
def self.to_csv(story_ids)
  csv = []
  conn = ActiveRecord::Base.connection.raw_connection
  conn.copy_data("COPY (SELECT * FROM stories WHERE stories.id IN (#{story_ids.join(',')})) TO STDOUT WITH (FORMAT CSV, HEADER TRUE, FORCE_QUOTE *, ESCAPE E'\\\\');") do
    while row = conn.get_copy_data
      csv.push(row)
    end
  end
  csv.join("\r\n")
end
My controller:
send_data Story.to_csv(Story.order(:created_at).pluck(:id))
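One caveat with the model method above: COPY does not accept bind parameters, so story_ids is interpolated straight into the SQL. Since the ids come from pluck here that is safe, but if the list ever comes from user input it is worth coercing each value to an integer first. A minimal sketch (the copy_sql_for_ids helper name is my own; in the model it would be a class method):

```ruby
# Hypothetical helper: build the COPY statement from a list of ids,
# coercing each one with Integer() so a malformed value raises an
# ArgumentError instead of being spliced into the SQL.
def copy_sql_for_ids(story_ids)
  ids = story_ids.map { |id| Integer(id) }.join(',')
  "COPY (SELECT * FROM stories WHERE stories.id IN (#{ids})) " \
  "TO STDOUT WITH (FORMAT CSV, HEADER TRUE, FORCE_QUOTE *, ESCAPE E'\\\\');"
end
```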
AFAIK you need to use the copy_data method on the underlying PostgreSQL database connection for this:

copy_data(sql)

call-seq: conn.copy_data( sql ) {|sql_result| ... } -> PG::Result

Execute a copy process for transferring data to or from the server.

This issues the SQL COPY command via #exec. The response to this (if there is no error in the command) is a PG::Result object that is passed to the block, bearing a status code of PGRES_COPY_OUT or PGRES_COPY_IN (depending on the specified copy direction). The application should then use #put_copy_data or #get_copy_data to transmit or receive data rows and should return from the block when finished.
And there's even an example:

conn.copy_data "COPY my_table TO STDOUT CSV" do
  while row = conn.get_copy_data
    p row
  end
end
ActiveRecord's wrapper for the raw database connection doesn't know what copy_data is, but you can use raw_connection to unwrap it:

conn = ActiveRecord::Base.connection.raw_connection
csv = []
conn.copy_data('copy stories to stdout with csv header') do
  while row = conn.get_copy_data
    csv.push(row)
  end
end

That would leave you with an array of CSV strings in csv (one CSV row per array entry), and you could csv.join("\r\n") to get the final CSV data.
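For very large tables, buffering every row into an array before joining doubles the memory footprint. One alternative (a sketch of my own, not part of the answer above) is to yield rows lazily through an Enumerator, so a caller can stream them as they arrive; each_csv_row is a hypothetical helper name:

```ruby
# Sketch: yield CSV rows from a COPY one at a time instead of
# buffering them all. `conn` is anything exposing the
# PG::Connection copy_data / get_copy_data interface.
def each_csv_row(conn, sql)
  return enum_for(:each_csv_row, conn, sql) unless block_given?
  conn.copy_data(sql) do
    # get_copy_data returns each row string, then nil when done
    while (row = conn.get_copy_data)
      yield row
    end
  end
end
```

Rails can stream an enumerable response body, so the enumerator form could back a streamed download; with send_data you would still join the rows into one string first.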