I would like some advice on the best option, as I am experiencing slow data retrieval (over 30 seconds) from my web-based API.
I have over 100 IoT sensors (a number that will keep growing) that transmit updated sensor readings to my IoT Hub, which then saves them to a database or storage.
Previously I saved all my sensor data to a single SQL table, but as the data grew, retrieval became very slow, so I switched to Azure Table Storage. Each sensor has its own table; the partition key is the month and year (e.g. 202012), and the row key is a timestamp (e.g. 0002518033824243332546).
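For context, the key scheme above can be sketched in Python. The exact row-key encoding is an assumption on my part: here I use a zero-padded inverted timestamp so the newest rows sort first lexicographically, which matches the general shape of the example key but may not be the precise encoding in use.

```python
from datetime import datetime, timezone

# Upper bound used to invert the timestamp so newer rows sort first.
# This is an illustrative choice, not necessarily the real encoding.
MAX_TICKS = 10**19

def partition_key(ts: datetime) -> str:
    # Partition key is year + month, e.g. "202012".
    return ts.strftime("%Y%m")

def row_key(ts: datetime) -> str:
    # Convert to a tick-like value (100 ns units since the Unix epoch),
    # invert it, and zero-pad to a fixed width so that lexicographic
    # order equals reverse chronological order.
    ticks = int(ts.timestamp() * 10_000_000)
    return str(MAX_TICKS - ticks).zfill(22)

ts = datetime(2020, 12, 15, 12, 0, tzinfo=timezone.utc)
print(partition_key(ts))  # 202012
```

With fixed-width inverted row keys, a "latest N readings" query can simply take the first N rows of the partition.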
This proved to be much faster, since each sensor's table holds far less data, but as the table for any particular sensor grows and I need to retrieve data across a longer period (e.g. one month), queries become very slow again. Each sensor transmits an update every minute, so each day produces 1,440 records and a 31-day month about 44,640 records.
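To make the numbers and the query shape concrete: under this design, a one-month read is a single-partition scan, selected with an Azure Table Storage OData filter on the partition key (a minimal sketch; the partition value follows the example scheme above).

```python
def month_filter(year: int, month: int) -> str:
    # OData filter string selecting one month's partition
    # in Azure Table Storage, e.g. "PartitionKey eq '202012'".
    return f"PartitionKey eq '{year:04d}{month:02d}'"

# One record per minute per sensor.
records_per_day = 60 * 24
records_per_31_day_month = records_per_day * 31

print(month_filter(2020, 12))        # PartitionKey eq '202012'
print(records_per_day)               # 1440
print(records_per_31_day_month)      # 44640
```

So a one-month query has to page through roughly 44,640 entities per sensor, which is where the latency comes from.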
Is there a better solution for my requirement?
Would having an individual SQL table for each sensor be a good idea? Is there a limit on the number of tables in a SQL database?
Thank you.