Re: Moved to different Server - Now all Existing Files corru
It seems to work for new zip files, so the script looks OK.
I'd have a look at the ones in storage. You can find the path by looking in the files table in your database. It has the storage path for each uploaded file. Download it...
Re: Moved to different Server - Now all Existing Files corru
Are they downloading with the correct filesize or as zero bytes?
What happens if you download one of the files directly from the server, rename it to .zip and open it? Does it open? Just trying to work out whether it's script or server config related...
SMTP will work with Google. Use these settings:
The method for sending emails via the script: smtp
Whether your SMTP server requires authentication: yes
Your SMTP host if you've selected SMTP email method: ssl://smtp.gmail.com
Your SMTP username if SMTP auth is required: [Your Gmail Email...
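If you want to check the Gmail credentials independently of the script first, here's a rough standalone test using PHPMailer (just my example - it's not how the script itself sends mail, and the address/password values are placeholders):

<?php
// Standalone Gmail SMTP test - placeholder values only, not part of the script.
require 'PHPMailerAutoload.php';

$mail = new PHPMailer();
$mail->isSMTP();
$mail->Host       = 'smtp.gmail.com';   // same server as ssl://smtp.gmail.com above
$mail->SMTPSecure = 'ssl';
$mail->Port       = 465;
$mail->SMTPAuth   = true;
$mail->Username   = 'your.address@gmail.com';   // placeholder
$mail->Password   = 'your-app-password';        // placeholder
$mail->setFrom('your.address@gmail.com');
$mail->addAddress('someone@example.com');       // placeholder recipient
$mail->Subject = 'SMTP test';
$mail->Body    = 'If this arrives, the Gmail SMTP details are working.';

echo $mail->send() ? 'Sent OK' : 'Error: ' . $mail->ErrorInfo;

If that sends OK, the same host, username and password should work in the script's email settings.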
Hi Martin,
If you ever need any urgent support from us, please raise a ticket. I've had a look and you've used the ticketing system for pre-sales but not for support issues. We don't always monitor these forums so we might miss things posted here. :)
Thanks,
Adam.
The URL upload check is done via this function:
if(UserPeer::getAvailableFileStorage($Auth->id) <= 0)
So this checks the available space for the currently logged-in user (there's a rough sketch of the logic below).
This limit is pulled from the user account's 'storageLimitOverride' value (in admin, edit user, storage override)
or
The global...
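To give a rough idea of how that fits together, here's a simplified sketch (not the actual UserPeer code - the loader, the helper and the global setting name are assumptions):

// Simplified illustration only - not the real UserPeer implementation.
public static function getAvailableFileStorage($userId)
{
    // hypothetical loader for the user record
    $user = self::loadUserById($userId);

    // per-user override (admin, edit user, storage override)
    $limitBytes = $user->storageLimitOverride;

    // fall back to the global default when no override is set (setting name assumed)
    if ($limitBytes === null) {
        $limitBytes = SITE_CONFIG_DEFAULT_FILE_STORAGE;
    }

    // hypothetical helper returning the total size of the user's existing files
    $usedBytes = self::getTotalFileSizeForUser($userId);

    return $limitBytes - $usedBytes;
}

If that comes back as 0 or less, the URL upload is rejected by the if() check above.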
This could be a limit in your NGINX config. Do you have this line in your NGINX config file:
client_max_body_size 5G;
As detailed here: http://forum.mfscripts.com/viewtopic.php?f=20&t=388&p=4583&hilit=nginx#p4551
If that doesn't work try:
client_max_body_size 2G;
Remember to restart Nginx...
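For reference, the directive normally sits inside the http block (or a specific server block) - this is just an illustration of the placement, your own nginx.conf layout may differ:

http {
    # ... other settings ...

    # allow request bodies (uploads) up to 5GB
    client_max_body_size 5G;
}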
Hi,
I've just tried a registration on your site but I'm not getting any emails at all. I wanted to see if it's an email client issue or not.
Have you tried sending via SMTP? You'll probably get fewer emails landing in spam folders as well.
Thanks,
Adam.
I spent some time yesterday working through this to get the performance issues resolved. The main problem was php-fpm processes being maxed out by the current downloads. These were all being handled via PHP, so if a download took 6 hours, it would tie up a single PHP process for this...
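To illustrate why that ties things up (a simplified sketch, not the script's actual download code - $filePath is just a placeholder):

<?php
// Simplified illustration of a PHP-streamed download - not the script's real code.
// While this loop runs, one php-fpm worker is occupied for the whole transfer,
// so a 6-hour download holds that worker for 6 hours.
$filePath = '/path/to/stored/file'; // placeholder
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($filePath));

$handle = fopen($filePath, 'rb');
while (!feof($handle)) {
    echo fread($handle, 8192);
    flush();
}
fclose($handle);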
At the top of /plugins/fileleech/includes/_append_url_upload_handler.php, change this:
define('PLOWDOWN_PATH', '/usr/local/bin/plowdown');
to
define('PLOWDOWN_PATH', '/plowdown');
If that doesn't work, change it to whatever path plowdown is actually installed in.
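If you're not sure where it is installed, running "which plowdown" on the server should print the full path to put in that define (that's just a general shell tip, not something specific to the script).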