Hi all, I am working on a project for tracking vehicles' GPS locations. Every 5 seconds each vehicle sends its current location (lat, lng) to the server via an API call. We expect about 10,000 vehicles in the fleet.
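
For reference, each ping is roughly a document shaped like this (the field names are my simplification for the question, not the exact API payload):

# One GPS ping per vehicle every 5 seconds (field names simplified for illustration)
ping = {
    "vehicleId": "V-1042",          # which vehicle sent it
    "ts": "2025-01-15T08:30:05Z",   # time the position was recorded (UTC)
    "lat": 12.9716,                 # latitude
    "lng": 77.5946,                 # longitude
}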

The math:
Per Minute => 10,000 * 12           = 120,000 (rows)
Per Day    => 120,000 * (60*24)     = 172,800,000 (rows)
We expect to retain the data for at least 3 months, which ends up at 172,800,000 * (30*3) = 15.55 billion rows.
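
As a quick sanity check, the same arithmetic in Python:

vehicles = 10_000
pings_per_minute = 60 // 5                 # one ping every 5 seconds -> 12 per minute
per_minute = vehicles * pings_per_minute   # 120,000 rows per minute
per_day = per_minute * 60 * 24             # 172,800,000 rows per day
retention = per_day * 30 * 3               # 15,552,000,000 rows (~15.55 billion) over 3 months
print(per_minute, per_day, retention)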

My go-to choice for any database development is MySQL, but with this volume of data MySQL becomes very slow on insert queries; it gets worse once the table crosses 10 million rows, and we need to write at a much higher frequency than that. To solve this, I switched to MongoDB. MongoDB performs very well on inserts, and I do not see any lag while inserting. The issue arises in MongoDB when running select (find) queries: once the collection holds at least 100 million documents, those queries become very slow.
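
To give an idea of the read pattern (collection and field names simplified, not my exact schema), a typical query pulls one vehicle's track for a time window; my understanding is that without a compound index like the one sketched below, such a query ends up scanning the whole collection:

from datetime import datetime, timedelta
from pymongo import MongoClient, ASCENDING, DESCENDING

client = MongoClient("mongodb://localhost:27017")   # connection string simplified
coll = client["fleet"]["pings"]

# Compound index: equality match on vehicleId, then range/sort on ts.
coll.create_index([("vehicleId", ASCENDING), ("ts", DESCENDING)])

# Last hour of positions for a single vehicle, newest first.
since = datetime.utcnow() - timedelta(hours=1)
track = list(
    coll.find({"vehicleId": "V-1042", "ts": {"$gte": since}})
        .sort("ts", DESCENDING)
)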

I am looking for suggestions here. If my architecture is wrong, or if I need to switch to a different database, please suggest a good design.

Thank you,