When importing a record with a large field (longer than 124214 characters), I get the error
"field larger than field limit (131072)"
I saw from other posts how to solve this in Python, but I don't know if it is possible in cqlsh.
Thanks
Take a look at this answer:
_csv.Error: field larger than field limit (131072)
You will need to add this fix to the top of the cqlsh file, right after its existing imports (cqlsh already imports sys, so nothing else is needed):

import csv
import getpass
csv.field_size_limit(sys.maxsize)
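For context, here is a minimal standalone sketch of what that call changes; the demo row below is illustrative and not part of cqlsh:

import csv
import io
import sys

# The csv module refuses fields longer than this limit
# (default: 131072 characters, i.e. the error from the question).
print(csv.field_size_limit())        # 131072

# Raise the limit to the maximum the platform allows. On some
# platforms sys.maxsize overflows the underlying C long; a large
# explicit value such as 1_000_000_000 works there instead.
csv.field_size_limit(sys.maxsize)

# A row whose first field is ~200,000 characters now parses fine.
big_row = io.StringIO("x" * 200_000 + ",second\n")
first, second = next(csv.reader(big_row))
print(len(first), second)            # 200000 second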
Rather than hacking the cqlsh file, there is a standard option provided by Cassandra to change field_size_limit. The Cassandra installation includes a cqlshrc.sample file in the conf directory of a tarball distribution, where the field_size_limit option can be found and changed. To make cqlsh read its options from this file, copy cqlshrc.sample from the conf directory to the hidden .cassandra folder of your user home directory and rename it to cqlshrc.
The Cassandra documentation contains more details: http://docs.datastax.com/en/cql/3.1/cql/cql_reference/cqlsh.html?scroll=refCqlsh__cqlshUsingCqlshrc
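For reference, the option sits in the [csv] section of cqlshrc.sample; after uncommenting and raising it, the relevant part of ~/.cassandra/cqlshrc looks roughly like this (surrounding comments vary between Cassandra versions):

[csv]
;; The size limit for parsed fields
field_size_limit = 1000000000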
1. Download and extract the Cassandra distribution from https://cassandra.apache.org/download/
2. You will find the cqlshrc.sample file in the conf directory after extracting.
3. Copy cqlshrc.sample to ~/.cassandra and rename it to cqlshrc.
4. Open the cqlshrc file and change "; field_size_limit = 131072" to "field_size_limit = 1000000000". Don't forget to remove the ";", which marks the option as commented out.
5. Open a new terminal and run your queries (see the example below).
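With the raised limit in place, re-running the import should succeed. A hypothetical example; the keyspace, table, and file names are placeholders:

COPY my_keyspace.my_table FROM 'data.csv' WITH HEADER = TRUE;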