Hi, I'm in need of performance optimizations. My site has gotten pretty big over the past year, and the download script is starting to hold me back from expanding.
First of all here is my file server specs.
--HW--
Intel E3 1270v3
32GB RAM
120GB SSD
4x4TB HDDs
10Gbps Unmetered
--SW--
Nginx
php-fpm
The problem I have is at peak times, when there can be around 1000 people all downloading at once. That means there must be 1000 php-fpm processes, or else the server grinds to a halt. On top of that, the max execution time must be set to a high number (I have mine at 4 hours), otherwise downloads just cut off when the time is up.
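For context, the situation described above corresponds to roughly this php-fpm pool configuration (a sketch only; the pool name, file path, and exact values are my assumptions based on the numbers in this post, not the actual config):

```ini
; Hypothetical pool config (/etc/php-fpm.d/www.conf) matching the setup above:
; one worker held for the entire duration of each download.
[www]
pm = static
pm.max_children = 1000                        ; one php-fpm worker per concurrent download

; keep long transfers from being killed mid-download
php_admin_value[max_execution_time] = 14400   ; 4 hours, as described above
```

With each worker pinned for the whole transfer, worker count and RAM scale with concurrent downloads rather than with actual PHP work, which is why this approach stops scaling.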
I want to know: is it possible to cut the PHP download script off as soon as the download has started, so the transfer is served by nginx alone? If not, how do other big file-sharing sites do it? This is the problem I have at the moment: http://puu.sh/aIwK0/bc490b0509.png
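For what it's worth, nginx has a built-in mechanism for exactly this hand-off: the `X-Accel-Redirect` response header. Here's a minimal sketch; the `/protected/` location, the storage path, and the `download.php` script are hypothetical names I'm using for illustration, not anything from the setup above.

```nginx
# Hypothetical nginx config: an internal-only location serving the real files.
# Clients cannot request /protected/... directly; only an X-Accel-Redirect
# header emitted by PHP can route a request here.
location /protected/ {
    internal;
    alias /mnt/storage/files/;   # assumed path on the 4x4TB array
}
```

The PHP side then does only the quick work (auth, download counting) and hands the transfer off, freeing the php-fpm worker immediately:

```php
<?php
// Hypothetical download.php: validate the request, then let nginx stream the file.
$file = basename($_GET['file']);   // basename() strips any path traversal
// ... auth / download-accounting checks would go here ...
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $file . '"');
header('X-Accel-Redirect: /protected/' . rawurlencode($file));
exit;   // script ends now; the php-fpm worker is free while nginx serves the file
```

Because PHP exits as soon as the headers are sent, `max_execution_time` no longer matters for the transfer, and the worker pool only needs to cover the brief authorization step rather than 1000 long-running downloads.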
Thanks