Sunday 18 August 2019

mysql - php server handling many requests

I have a PHP file that queries a MySQL database under a few constraints and then returns some data.



The only issue is that if I get an influx of requests all at once, the last requests have to wait a while for their responses.



Is there a better way to handle this kind of thing so that the requests are fast and no one has to wait?



** Added Info




Currently my script takes a few POST parameters, verifies some information against the database, and then echoes a JSON-encoded response string.
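For reference, a minimal sketch of the kind of script described above, assuming PDO and a hypothetical `users` table. The DSN, credentials, table, and column names are illustrative assumptions, not the asker's actual code:

```php
<?php
// Hypothetical endpoint: looks up a user by POSTed id/token and returns JSON.
// The DSN, credentials, and the `users` table are assumptions for illustration.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'app_user', 'secret', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$userId = $_POST['user_id'] ?? null;
$token  = $_POST['token']   ?? null;

// Prepared statement: parameters are bound, not concatenated into the SQL.
$stmt = $pdo->prepare('SELECT name, email FROM users WHERE id = ? AND token = ?');
$stmt->execute([$userId, $token]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);

header('Content-Type: application/json');
echo json_encode($row ?: ['error' => 'not found']);
```

With a script like this, the per-request cost is dominated by the database round trip, so an index on the columns in the WHERE clause matters more than the PHP itself.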



If I make 1000 requests through XMLHttpRequest in JS, the 1000th request comes back almost a minute later.



As far as I know, PHP handles requests one at a time, but then how do major sites like Facebook cope when thousands upon thousands of users update info at the same time?
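For what it's worth, PHP itself doesn't have to serve requests serially: the web server typically runs a pool of PHP worker processes (e.g. via PHP-FPM), so many requests execute concurrently and the database handles the parallel connections. A sketch of the relevant pool settings follows; the values are illustrative, not tuned recommendations:

```ini
; PHP-FPM pool config (e.g. pool.d/www.conf) - values are illustrative
pm = dynamic
pm.max_children = 50      ; up to 50 PHP workers handling requests at once
pm.start_servers = 5
pm.min_spare_servers = 5
pm.max_spare_servers = 10
```

If requests still queue up with a worker pool in place, the bottleneck is usually the database (missing indexes, lock contention, or too few connections) rather than PHP.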

