Channel: StackExchange Replication Questions

Database design for a lot of rows with data archive


I'm new to database design at a larger scale. I have some knowledge of and experience with MySQL, but now I'm running into performance issues.

Currently I'm using MySQL with three databases:

  • History
  • Snapshot
  • Static

I insert data into the History database. For one user there are approximately 5 million rows per month (and the number of users is slowly growing every week). As my SELECT queries have become slow, probably due to the large amount of data, I have triggers that copy inserts into the Snapshot database and delete older entries. SELECT statements are very quick on the Snapshot database. But as I add more triggers, MySQL seems unable to withstand the load. The Static database is just a users table and some metadata.

So my question is: how can I achieve this push/pop behavior with my tables using MySQL or any other database technology?

I want to insert data into the Snapshot database, and all data with a timestamp older than 12 hours should be moved to History (for later data analysis, once or twice a week) and deleted from Snapshot, so that Snapshot keeps only a minimal amount of data to work with. Is this achievable with MySQL or something else? What would be a recommended hardware setup for such a database design?
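For what it's worth, here is a minimal sketch of the move-then-delete pattern described above, using Python's built-in sqlite3 as a stand-in for MySQL so it runs anywhere. The table and column names (`snapshot`, `history`, `ts`, `payload`) are hypothetical, not from the actual schema; in MySQL the same two statements could be run periodically by the Event Scheduler or a cron job.

```python
import sqlite3
from datetime import datetime, timedelta

# In-memory SQLite stands in for the real MySQL databases; in production
# snapshot and history would live in separate schemas.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE snapshot (id INTEGER PRIMARY KEY, ts TEXT, payload TEXT)")
conn.execute("CREATE TABLE history  (id INTEGER PRIMARY KEY, ts TEXT, payload TEXT)")

now = datetime(2024, 1, 2, 12, 0, 0)
rows = [
    (1, (now - timedelta(hours=1)).isoformat(), "fresh"),   # stays in snapshot
    (2, (now - timedelta(hours=13)).isoformat(), "stale"),  # older than 12 h
    (3, (now - timedelta(hours=36)).isoformat(), "old"),    # older than 12 h
]
conn.executemany("INSERT INTO snapshot VALUES (?, ?, ?)", rows)

def archive(conn, now, max_age=timedelta(hours=12)):
    """Copy rows older than max_age into history, then delete them from snapshot."""
    cutoff = (now - max_age).isoformat()
    with conn:  # one transaction: either both statements apply or neither
        conn.execute("INSERT INTO history SELECT * FROM snapshot WHERE ts < ?", (cutoff,))
        conn.execute("DELETE FROM snapshot WHERE ts < ?", (cutoff,))

archive(conn, now)
snap = conn.execute("SELECT COUNT(*) FROM snapshot").fetchone()[0]
hist = conn.execute("SELECT COUNT(*) FROM history").fetchone()[0]
print(snap, hist)  # 1 2
```

Wrapping the copy and delete in a single transaction matters: if the delete ran outside the transaction and failed, the same rows would be copied again on the next run.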

As for writing to and reading from the database: I do most of the writing (and some reading) from Python with pymysql, and most of the reading with the Node.js mysql driver.

I hope there is a not-too-complicated solution to my issue, but I'll be happy with any advice or pointers on what to use.

