For example, let's say you have an eBay clone and you are storing all the data in a SQL database. One way you could design your search is to query the database on every search: if someone searches for "red sweatshirt", you could run something like SELECT * FROM items WHERE MATCH("red sweatshirt"), or something along those lines.
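To make that concrete, here is roughly what I'm imagining, as a toy sketch in Python using SQLite's FTS5 full-text index (the items table and its columns are just made up for the example; a real marketplace would probably use Postgres full-text search or a dedicated search engine, but the shape of the query is the same):

    import sqlite3

    # Toy example: a full-text index over item listings.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE VIRTUAL TABLE items USING fts5(title, description)")
    conn.executemany(
        "INSERT INTO items (title, description) VALUES (?, ?)",
        [
            ("Red hooded sweatshirt", "Cozy cotton hoodie, size M"),
            ("Blue jeans", "Slim fit denim"),
            ("Vintage red sweatshirt", "Lightly worn"),
        ],
    )

    # MATCH goes through the inverted index, so this is much cheaper than
    # scanning every row with LIKE '%red%' AND LIKE '%sweatshirt%'.
    rows = conn.execute(
        "SELECT rowid, title FROM items WHERE items MATCH ?",
        ("red sweatshirt",),
    ).fetchall()
    print(rows)  # both sweatshirt listings come back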
But surely these companies cache things (maybe using Redis?), since I would assume doing a SQL query every time a user searches for something would be way too slow. However, how do you solve the problem of things constantly being added? What if every second there are 20 new items posted that match the search "red sweatshirt"? If you use a cache, you would be missing out on those new items.
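The only idea I've come up with is caching with a short expiry, so results are at most a few seconds stale instead of permanently stale. Something like this (assuming the redis-py client; search_database is just a stand-in for the real query above):

    import json
    import redis  # assuming the redis-py client

    r = redis.Redis()
    CACHE_TTL_SECONDS = 30  # lower = fresher results, but more load on the database

    def search_database(query: str) -> list[dict]:
        # Stand-in for the real SQL / full-text query.
        return [{"id": 1, "title": "Red hooded sweatshirt"}]

    def cached_search(query: str) -> list[dict]:
        key = "search:" + query.lower().strip()
        cached = r.get(key)
        if cached is not None:
            return json.loads(cached)        # cache hit: skip the database
        results = search_database(query)     # cache miss: hit the database
        # Expire after a short TTL so newly listed items show up soon,
        # instead of the cached result living forever.
        r.set(key, json.dumps(results), ex=CACHE_TTL_SECONDS)
        return results

But that only bounds the staleness rather than eliminating it, which is what got me wondering how it's really done.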
This is a basic question I was pondering that I couldn't come up with an answer to, and I was curious whether some more experienced engineers could enlighten me. I mainly wanted to get an idea of how companies design their search in terms of handling requests and what database structure they would use.