Recently I started working on an ElasticSearch (ES) implementation for a legacy e-commerce app written in PHP on top of MySQL. I am completely new to all of this; reading the docs is fine, but I really need somebody with experience to advise me.
From the ES documentation I was able to set up a new cluster, and I also found out that rivers are deprecated and should be replaced, so I replaced them with Logstash and the JDBC MySQL connector.
At this point I have the cluster and the Logstash JDBC pipeline in place. The database structure of the application is not really optimal and is very hard to replace, but I'd like to replicate it into the ES index in the best possible way.
DB Structure:
Products
+-------------------------------+-------+--------+
| Id                            | Title | Price  |
+-------------------------------+-------+--------+
| 00c8234d71c4e94f725cd432ebc04 | Alpha | 589,00 |
| 018357657529fef056cf396626812 | Beta  | 355,00 |
| 01a2c32ceeff0fc6b7dd4fc4302ab | Gamma | 0,00   |
+-------------------------------+-------+--------+
Flags
+------------+-------------+
| Id         | Title       |
+------------+-------------+
| sellout    | Sellout     |
| discount   | Discount    |
| topproduct | Top Product |
+------------+-------------+
flagsProducts (n:m pivot)
+------+-------------------------------+------------+------------+
| Id   | ProductId                     | FlagId     | ExternalId |
+------+-------------------------------+------------+------------+
| 1552 | 00c8234d71c4e94f725cd432ebc04 | sellout    | NULL       |
| 2845 | 00c8234d71c4e94f725cd432ebc04 | topproduct | NULL       |
| 9689 | 018357657529fef056cf396626812 | discount   | NULL       |
| 4841 | 01a2c32ceeff0fc6b7dd4fc4302ab | discount   | NULL       |
+------+-------------------------------+------------+------------+
Those string IDs are a complete disaster (but I have to deal with them for now). At first I thought I should build a flat Products index in ES, but what about the multiple entity bindings (the flags)?
Basically, a type in Elasticsearch represented a class of similar documents and had a name such as customer or item. Lucene has no concept of document data types, so Elasticsearch would store the type name of each document in a metadata field called _type.
The schema in Elasticsearch is a mapping that describes the fields in the JSON documents along with their data types, as well as how they should be indexed in the Lucene indexes that lie under the hood. Because of this, in Elasticsearch terms, we usually call this schema a "mapping".
That's a great start!
I would definitely flatten it all out (i.e. denormalize) and come up with product documents that look like the ones below. That way you get rid of the N:M relationship between products and flags by simply creating a flags array for each product, which makes it much easier to query those flags.
{
  "id": "00c8234d71c4e94f725cd432ebc04",
  "title": "Alpha",
  "price": 589.0,
  "flags": ["Sellout", "Top Product"]
}
{
  "id": "018357657529fef056cf396626812",
  "title": "Beta",
  "price": 355.0,
  "flags": ["Discount"]
}
{
  "id": "01a2c32ceeff0fc6b7dd4fc4302ab",
  "title": "Gamma",
  "price": 0.0,
  "flags": ["Discount"]
}
The product mapping type would look like this:
PUT products
{
  "mappings": {
    "product": {
      "properties": {
        "id": {
          "type": "string",
          "index": "not_analyzed"
        },
        "title": {
          "type": "string"
        },
        "price": {
          "type": "double",
          "null_value": 0.0
        },
        "flags": {
          "type": "string",
          "index": "not_analyzed"
        }
      }
    }
  }
}
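As a quick illustration of why the denormalized flags array pays off, here is a minimal search sketch that filters products by a single flag with a term query. It assumes the products index and product mapping type defined above; the exact match works because flags is mapped as not_analyzed.
POST products/product/_search
{
  "query": {
    "term": {
      "flags": "Sellout"
    }
  }
}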
Since you already have the Logstash jdbc input, all you're missing is the proper SQL query to fetch the products and their associated flags.
SELECT p.Id AS id, p.Title AS title, p.Price AS price, GROUP_CONCAT(f.Title) AS flags
FROM Products p
JOIN flagsProducts fp ON fp.ProductId = p.Id
JOIN Flags f ON f.Id = fp.FlagId
GROUP BY p.Id, p.Title, p.Price
Which would get you rows like these:
+-------------------------------+-------+-------+---------------------+
| id                            | title | price | flags               |
+-------------------------------+-------+-------+---------------------+
| 00c8234d71c4e94f725cd432ebc04 | Alpha |   589 | Sellout,Top Product |
| 018357657529fef056cf396626812 | Beta  |   355 | Discount            |
| 01a2c32ceeff0fc6b7dd4fc4302ab | Gamma |     0 | Discount            |
+-------------------------------+-------+-------+---------------------+
Using Logstash filters you can then split the flags field into an array and you're good to go.
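For reference, here is a minimal sketch of what the whole Logstash pipeline could look like. The connection string, credentials, driver path, Elasticsearch host and index name are placeholders you would adapt to your own setup.
input {
  jdbc {
    # Placeholder connection settings -- adapt to your environment
    jdbc_connection_string => "jdbc:mysql://localhost:3306/shop"
    jdbc_user => "user"
    jdbc_password => "password"
    jdbc_driver_library => "/path/to/mysql-connector-java.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    statement => "SELECT p.Id AS id, p.Title AS title, p.Price AS price, GROUP_CONCAT(f.Title) AS flags FROM Products p JOIN flagsProducts fp ON fp.ProductId = p.Id JOIN Flags f ON f.Id = fp.FlagId GROUP BY p.Id, p.Title, p.Price"
  }
}

filter {
  mutate {
    # Turn the comma-separated flags string into an array
    split => { "flags" => "," }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "products"
    document_type => "product"
    document_id => "%{id}"
  }
}
Setting document_id to %{id} means re-running the pipeline updates existing product documents instead of creating duplicates.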