I want to run a query over thousands of tuples. For each day, I need to save the first ID, the last ID, and the date from a historic table into a new table. I have data from 2020 to 2022, and each day can have 600,000 rows or more. I have thought of two solutions:
Doing a query each time with a LIMIT of 600,000 and saving the first ID, last ID, and date, ordering by dates or IDs.
Doing a query day by day and getting the first and last ID.
The problem is that these queries could take a long time because of the ordering.
I'm doing this with SQL and need to execute it from PHP with a cron job every day to save that day's data. First, I'm building the new table with the historical data.
Does anyone have a tip or another way to do this? Thanks!
My Answer:
One possible solution is to create an index on the date column of your historic table. This will speed up queries that filter by date.
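As a minimal sketch of the index idea, assuming a table named `historic` with columns `id` and `saved_at` (both names are assumptions, since the post doesn't give the schema); SQLite is used here only as a portable stand-in for your real database:

```python
import sqlite3

# In-memory SQLite stands in for the real database; the table and
# column names (historic, id, saved_at) are assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE historic (id INTEGER PRIMARY KEY, saved_at TEXT)")
conn.executemany(
    "INSERT INTO historic (saved_at) VALUES (?)",
    [("2020-01-01",), ("2020-01-01",), ("2020-01-02",)],
)

# Index on the date column so per-day filters avoid a full table scan.
conn.execute("CREATE INDEX idx_historic_saved_at ON historic (saved_at)")

# With the index in place, a day filter can use it directly;
# the plan's detail column should mention idx_historic_saved_at.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT id FROM historic WHERE saved_at = '2020-01-01'"
).fetchall()
print(plan)
```

The same `CREATE INDEX` statement works in MySQL or PostgreSQL; only the `EXPLAIN` output format differs.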
You could also consider partitioning your historic table by date, which can also improve query performance when dealing with large amounts of data.
Another option could be to use a window function in your query to get the first and last ID for each day. This can help avoid the need for ordering the data and potentially improve performance.
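For the per-day first/last ID, a plain `GROUP BY` with `MIN`/`MAX` is an even simpler alternative to a window function here, since it computes both IDs for every day in a single pass. A sketch, again assuming a `historic(id, saved_at)` table (names are assumptions) and using SQLite as a stand-in; the final `ORDER BY` only sorts the small per-day summary, not the raw rows:

```python
import sqlite3

# SQLite stand-in; table and column names are assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE historic (id INTEGER PRIMARY KEY, saved_at TEXT)")
rows = [("2020-01-01",)] * 3 + [("2020-01-02",)] * 2
conn.executemany("INSERT INTO historic (saved_at) VALUES (?)", rows)

# One pass over the table: first and last id per day. The ORDER BY
# applies to the grouped summary (one row per day), not the raw data.
summary = conn.execute(
    """
    SELECT saved_at, MIN(id) AS first_id, MAX(id) AS last_id
    FROM historic
    GROUP BY saved_at
    ORDER BY saved_at
    """
).fetchall()
print(summary)  # [('2020-01-01', 1, 3), ('2020-01-02', 4, 5)]
```

The results of this query are exactly the rows to insert into the new summary table, so the daily cron job only needs to run it for the current date.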
Overall, it's important to optimize your query and database structure to handle the large amount of data efficiently. You may also want to consider implementing some form of data archiving or purging to keep the size of your historic table manageable.
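For the purging idea, one possible sketch: once a day's first/last IDs have been saved to the summary table, rows older than a retention cutoff can be deleted from the historic table. The cutoff value and names below are illustrative assumptions, with SQLite again standing in:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE historic (id INTEGER PRIMARY KEY, saved_at TEXT)")
conn.executemany(
    "INSERT INTO historic (saved_at) VALUES (?)",
    [("2020-01-01",), ("2021-06-15",), ("2022-03-01",)],
)

# Purge rows that were already summarized, keeping only data newer
# than the cutoff date (cutoff chosen here purely for illustration).
cutoff = "2022-01-01"
conn.execute("DELETE FROM historic WHERE saved_at < ?", (cutoff,))
conn.commit()

remaining = conn.execute("SELECT COUNT(*) FROM historic").fetchone()[0]
print(remaining)  # 1
```

On a large production table you would typically delete in batches (or drop whole partitions, if the table is partitioned by date) rather than in one statement, to avoid long locks.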