Hi all.
I am using Logstash to import some CSV files into Elasticsearch, and I have found the speed is too slow.
The config is:
input {
  stdin {}   # events are piped in on stdin, one CSV line per event
}
filter {
  csv {
    columns   => ["date", "metric1", "id", "metric2", "country_id", "metric3", "region_id"]
    separator => ","
  }
  mutate {
    convert => [ "id", "integer" ]
    convert => [ "country_id", "integer" ]
    convert => [ "region_id", "float" ]
  }
}
output {
  elasticsearch {
    action   => "index"
    protocol => "http"
    host     => "10.64.201.***"
    index    => "csv_test_data_01"
    workers  => 1
  }
  stdout {
    codec => rubydebug   # pretty-prints every event to the console
  }
}
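For reference, the two settings on this elasticsearch output that most directly affect indexing throughput are workers (parallel bulk senders) and flush_size (events per bulk request). A tuned variant of just the output section could look like the sketch below; it assumes the Logstash 1.x-era HTTP output that the protocol/host options indicate, and the values are illustrative, not measured:

output {
  elasticsearch {
    action     => "index"
    protocol   => "http"
    host       => "10.64.201.***"
    index      => "csv_test_data_01"
    workers    => 4       # assumption: more parallel bulk senders
    flush_size => 5000    # assumption: larger bulk batches per request
  }
}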
10.64.201.*** is the IP address of the Elasticsearch cluster's master node; the cluster has three nodes in total, and the CSV files are stored on one of those nodes.
I simply run the command: blablabla -f **.config < csv files
Logstash then begins importing the CSV files into the Elasticsearch cluster, but the speed is too slow.
Are there any better solutions for this case, or am I doing something wrong?
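For context, the elided command is presumably the Logstash binary itself; a hypothetical full invocation (the paths and the worker count are assumptions, not from the original post) would be:

# hypothetical invocation; -w sets the number of filter workers
bin/logstash -w 4 -f import.config < data.csv

Raising the filter worker count with -w helps when the csv/mutate filter stage, rather than Elasticsearch, is the bottleneck.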
You should start by isolating the problem: determine whether the slowness comes from the Logstash pipeline itself or from Elasticsearch indexing.