My client has a huge database containing just three fields:
This database has a few billion entries. I have no previous experience handling such large amounts of data.
He wants me to design an AJAX interface (like Google's) to search this database. My queries are as slow as a turtle.
What is the best way to search text fields in such a large database? If the user misspells a term in the interface, how can I return what they meant?
Oracle has provided database solutions since the 1970s. The most recent version of Oracle Database is designed to integrate with cloud-based systems and can manage massive databases with billions of records.
MongoDB is well suited to hierarchical data storage and, for some workloads, can be dramatically faster than a relational database management system (RDBMS); blanket claims of "100 times faster" should be taken with a grain of salt, though.
Can MySQL handle 1 billion records? Can MySQL handle 100 million records? Yes, it can handle billions of records. If your tables are properly indexed, they (or at least their indexes) fit in memory, and your queries are written properly, it shouldn't be an issue.
Indexing is a data structure technique that allows you to quickly retrieve records from a database file.
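To make that concrete, here is a minimal sketch using SQLite (via Python's standard library) as a stand-in for MySQL; the table and index names are hypothetical. The query plan shows the engine using the index to seek directly to the row instead of scanning the whole table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE articles (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany("INSERT INTO articles (title) VALUES (?)",
                 [("big data",), ("search tips",), ("mysql tuning",)])

# Without this index, the query below would be a full table scan.
conn.execute("CREATE INDEX idx_articles_title ON articles (title)")

# EXPLAIN QUERY PLAN reveals how the engine will execute the query.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT id FROM articles WHERE title = ?",
    ("mysql tuning",)).fetchone()
print(plan)  # the plan's detail column mentions idx_articles_title
```

The same idea applies in MySQL (`CREATE INDEX ... ON ...` plus `EXPLAIN`), just with different plan output.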
If you are using FULLTEXT indexes, your queries are written correctly, and the speed at which results are returned is still not adequate, you are entering territory where MySQL may simply not be sufficient for you.
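For reference, here is what word-level full-text search looks like, sketched with SQLite's FTS5 module as a stand-in (in MySQL the equivalent is a FULLTEXT index queried with `MATCH(col) AGAINST('term')`); the table name and sample rows are made up:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# FTS5 builds an inverted index over the text, so MATCH is a word lookup,
# not a LIKE '%...%' scan over every row.
conn.execute("CREATE VIRTUAL TABLE docs USING fts5(body)")
conn.executemany("INSERT INTO docs (body) VALUES (?)", [
    ("billions of records need good indexes",),
    ("ajax interfaces feel instant",),
])

rows = conn.execute(
    "SELECT body FROM docs WHERE docs MATCH ?", ("records",)).fetchall()
print(rows)  # only the row containing the word "records"
```

This is why full-text indexes stay fast on large tables: lookup cost depends on the term, not the table size.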
You may be able to tweak settings and purchase enough RAM to make sure your entire data set fits 100% in memory; the performance gains there can be huge.
I'd definitely recommend looking into tuning your MySQL configuration. We've had some silly settings in the past, and operating-system defaults tend to really suck!
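On the misspelling part of the original question: one minimal approach is fuzzy matching against a known vocabulary, sketched here with Python's standard `difflib` (the term list is hypothetical; a dedicated search engine's fuzzy matching would do this at scale):

```python
from difflib import get_close_matches

# Hypothetical vocabulary of indexed terms; in practice you'd build this
# from the distinct words in your searchable columns.
known_terms = ["database", "indexing", "performance"]

def suggest(query: str):
    # Return at most one close match; cutoff (0..1) controls how
    # forgiving the matcher is about spelling distance.
    return get_close_matches(query, known_terms, n=1, cutoff=0.6)

print(suggest("databse"))  # the typo still maps to "database"
```

This gives you a "did you mean ...?" suggestion; MySQL itself also offers `SOUNDEX()` for phonetic matching, though it is English-oriented and fairly crude.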
However, if you have trouble at that point, you can: