I can say there are no ready-to-use scripts, because your database schema would not match mine. You created all your pages and code, so the test is just a little more work on your side.
I have a heavily loaded MySQL server. The database is about 500 GB. The average load is about 7,000 queries per second: roughly 4,000 SELECTs and 3,000 INSERT/UPDATE/DELETEs. At peak the rate can reach 12,000-14,000 queries per second. The master replicates to another, much weaker server, which receives only the write queries from the master (about 3,000 per second) and handles them fine.
I want to upgrade the master server, so I plan to temporarily stop replication and switch all traffic to the replica. But I'm not sure this server will cope with such a load. More precisely, it will certainly withstand the write load, but I can hardly predict what will happen when all the SELECT queries are redirected to it as well: the master has eight times more RAM.
Therefore, I want to estimate this server's capacity somehow. The first idea that came to mind was to log all SELECT queries to a file for an hour, then write a script that replays those queries from the file against the tested server's database while I monitor the load. It is important that the test approximates the real peak load.
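As a rough sketch of that replay idea: assuming the SELECTs were captured to a plain-text file in MySQL's general-query-log format (enabled with `SET GLOBAL general_log = 'ON'`), a minimal Python script could extract the SELECT statements and replay them one by one, timing each query. The PyMySQL driver and the connection parameters below are assumptions; substitute your own driver and credentials. Note this single-threaded loop does not reproduce the real concurrency or arrival timing of 4,000 queries/second, so it gives only a lower bound on what the replica must handle.

```python
import re
import time

def extract_selects(log_lines):
    """Pull SELECT statements out of MySQL general-query-log lines.

    Assumes the default text format, where each statement appears
    after the 'Query' event keyword; adjust the regex if your log
    format differs (this is a sketch, not a drop-in tool).
    """
    selects = []
    for line in log_lines:
        m = re.search(r"\bQuery\s+(SELECT\b.*)", line, re.IGNORECASE)
        if m:
            selects.append(m.group(1).strip())
    return selects

def replay(queries, host, user, password, db):
    """Replay queries against the test server, returning per-query timings."""
    import pymysql  # assumed driver: pip install pymysql
    conn = pymysql.connect(host=host, user=user, password=password, database=db)
    timings = []
    try:
        with conn.cursor() as cur:
            for q in queries:
                start = time.perf_counter()
                cur.execute(q)
                cur.fetchall()  # drain the result set so timing includes transfer
                timings.append(time.perf_counter() - start)
    finally:
        conn.close()
    return timings

if __name__ == "__main__":
    with open("selects.log") as f:           # hypothetical captured log file
        queries = extract_selects(f)
    timings = replay(queries, "replica-host", "test_user", "secret", "mydb")
    print(f"{len(timings)} queries, max latency {max(timings):.3f}s")
```

To get closer to the real peak load, you would run several such replay loops in parallel (threads or processes) and watch the replica's CPU, disk I/O, and buffer-pool hit rate while they run.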
Perhaps ready-made scripts or software for this purpose already exist. Please advise.