Cassandra .csv import error: batch too large

I'm trying to import data from a .csv file into Cassandra 3.2.1 via the COPY command. The file contains only 299 rows with 14 columns. I get the error:

Failed to import 299 rows: InvalidRequest - code=2200 [Invalid query] message="Batch too large"

I used the following COPY command and tried to increase the batch size:

COPY table (Col1, Col2, ...) FROM 'file.csv' WITH DELIMITER=';' AND HEADER=TRUE AND MAXBATCHSIZE=5000;

I would think 299 rows is not too much to import into Cassandra, or am I wrong?

asked Apr 14 '16 by Emlon

2 Answers

Adding the CHUNKSIZE keyword resolved the problem for me.

e.g. COPY event_stats_user FROM '/home/kiren/dumps/event_stats_user.csv' WITH CHUNKSIZE=1;
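
Applied to the command from the question, the same fix might look like the sketch below (table, the abbreviated column list, and file.csv are the placeholders from the question; as I understand it, CHUNKSIZE caps how many rows are handed to each worker process at a time):

COPY table (Col1, Col2, ...) FROM 'file.csv' WITH DELIMITER=';' AND HEADER=TRUE AND CHUNKSIZE=1;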

answered Sep 20 '22 by kirenpillay

The error you're encountering is a server-side error message saying that the size (in bytes) of your batch insert is too large.

This batch size is defined in the cassandra.yaml file:

# Log WARN on any batch size exceeding this value. 5kb per batch by default.
# Caution should be taken on increasing the size of this threshold as it can lead to node instability.
batch_size_warn_threshold_in_kb: 5

# Fail any batch exceeding this value. 50kb (10x warn threshold) by default.
batch_size_fail_threshold_in_kb: 50

If you insert a lot of large columns, you can reach this threshold quickly. Note that setting MAXBATCHSIZE = 5000, as in your command, makes each batch bigger rather than smaller, since MAXBATCHSIZE is the maximum number of rows per batch; try reducing MAXBATCHSIZE to 200 instead.
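
For example, a sketch of the question's command with a smaller batch size (the table name, file path, and abbreviated column list are carried over unchanged from the question):

COPY table (Col1, Col2, ...) FROM 'file.csv' WITH DELIMITER=';' AND HEADER=TRUE AND MAXBATCHSIZE=200;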

More info on COPY options is available in the cqlsh documentation.

answered Sep 20 '22 by doanduyhai