 

How to populate an Elasticsearch index from a text file?

I'm planning to use an Elasticsearch index to store a huge city database with ~2.9 million records, and use it as the search engine for my Laravel application.

The thing is: I have the cities both in a MySQL database and in a CSV file. The file is ~300 MB.

How can I import it into an index as fast as possible?

asked May 26 '15 by Elias Soares

1 Answer

I solved this import using Logstash.

My import script is this:

input {
    # Read the exported city list line by line, starting from the beginning of the file
    file {
        path => ["/home/user/location_cities.txt"]
        type => "city"
        start_position => "beginning"
    }
}

filter {
    # Split each line into named fields; the separator is a literal tab character
    csv {
        columns => ["region", "subregion", "ufi", "uni", "dsg", "cc_fips", "cc_iso", "full_name", "full_name_nd", "sort_name", "adm1", "adm1_full_name", "adm2", "adm2_full_name"]
        separator => "	"
        remove_field => [ "host", "message", "path" ]
    }
}

output {
    # Index every event into the "location" index over HTTP
    elasticsearch {
        action => "index"
        protocol => "http"
        host => "127.0.0.1"
        port => "9200"
        index => "location"
        workers => 4
    }
}

This script imports a tab-separated file (with no quote delimiters around the fields) into an index called location, with the type city.
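
If you want more control over how the fields are analyzed, you can create the location index with an explicit mapping before starting the import. This is only a sketch, assuming a pre-5.x Elasticsearch (string fields with not_analyzed) and the column names from the csv filter above; field types are my own guesses, and any columns not listed are still mapped dynamically:

curl -XPUT 'http://127.0.0.1:9200/location' -d '
{
    "mappings": {
        "city": {
            "properties": {
                "full_name":    { "type": "string" },
                "full_name_nd": { "type": "string" },
                "sort_name":    { "type": "string", "index": "not_analyzed" },
                "cc_iso":       { "type": "string", "index": "not_analyzed" }
            }
        }
    }
}'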

To run it, execute bin/logstash -f import_script_file from the folder where you installed/extracted Logstash.
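
Depending on your Logstash version, you can also validate the configuration before kicking off the full import (the flag is --configtest on older 1.x/2.x releases, --config.test_and_exit on newer ones) and check progress from another terminal while it runs. For example:

bin/logstash -f import_script_file --configtest
bin/logstash -f import_script_file
curl 'http://127.0.0.1:9200/location/_count?pretty'

Note that the file input keeps tailing the file for new lines, so Logstash will not exit on its own when the import is done; stop it once the document count stops growing.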

answered Nov 10 '22 by Elias Soares