We are an IoT company that provides services to transportation and logistics companies. As an infrastructure provider, we supply GPS tracking devices to our clients.
The format of each GPS record is simple (gpsId, longitude, latitude, speed, direction, reportTime, etc.), but the volume is large. Every device reports its position every 10 seconds, and we have 100k devices, so 60 * 60 * 24 * 100,000 / 10 = 864M new rows are generated every day.
Using the data collected by a vehicle's tracking device, a client can review that vehicle's trace over a given time period (for example, the last 10 days, which requires 60 * 60 * 24 * 10 / 10 = 86.4K rows).
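For concreteness, here is the arithmetic above as a short Python sketch (the constants are just the numbers from this post):

```python
REPORT_INTERVAL_S = 10        # each device reports every 10 seconds
DEVICE_COUNT = 100_000        # fleet size
SECONDS_PER_DAY = 60 * 60 * 24

# New rows ingested per day across the whole fleet.
rows_per_day = SECONDS_PER_DAY * DEVICE_COUNT // REPORT_INTERVAL_S
print(f"{rows_per_day:,} rows/day")                 # 864,000,000

# Rows returned by one vehicle's 10-day trace query.
trace_days = 10
rows_per_trace = SECONDS_PER_DAY * trace_days // REPORT_INTERVAL_S
print(f"{rows_per_trace:,} rows per 10-day trace")  # 86,400
```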
Currently we use MySQL as the storage backend, taking advantage of sharding and table partitioning (based on gpsId). But since the data set is so large and is queried so frequently, I wonder whether a NoSQL store would fit this scenario better?
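To make the layout concrete, here is a minimal sketch of how rows might be routed under gpsId-based sharding, and how the same data could be keyed in a wide-column NoSQL store such as Cassandra or HBase. All names here (shard count, key format) are hypothetical illustrations, not our actual schema:

```python
import hashlib
from datetime import datetime, timezone

SHARD_COUNT = 16  # hypothetical number of MySQL shards

def shard_for(gps_id: str) -> int:
    """Hash-based routing: every report for a given gpsId lands on one
    shard, so a trace query only ever touches a single shard."""
    digest = hashlib.md5(gps_id.encode()).digest()
    return int.from_bytes(digest[:4], "big") % SHARD_COUNT

def row_key(gps_id: str, report_time: datetime) -> str:
    """Composite key in the style of a wide-column store: partition on
    gpsId and order rows by reportTime within the partition, so a 10-day
    trace becomes one contiguous range scan instead of an indexed lookup."""
    return f"{gps_id}#{report_time.strftime('%Y%m%d%H%M%S')}"

# Example: all reports for device "gps-0042" share a shard and sort by time.
ts = datetime(2015, 6, 1, 12, 0, 0, tzinfo=timezone.utc)
print(shard_for("gps-0042"), row_key("gps-0042", ts))
```

The design goal is the same in either system: keep one device's reports physically clustered and time-ordered, so a trace query is a sequential range scan rather than many scattered index lookups.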
Historical data is also useful for data analysis.
Any advice is appreciated.