In a PHP project I need to download CSV files from an FTP server, and I'm using PHP's ftp_* functions to do this.
I'm working on two separate computers: one downloads the FTP files with no problem; the other one initiates the FTP connection, opens and creates a file on my disk, but after a few seconds (it looks like a timeout) the script ends with this error:
PHP Warning: ftp_get(): Opening BINARY mode data connection for...
I've already tried passive mode, the connection is closed at the end of my script, and the strange thing is that it works on another computer and on my server.
So here are my questions:
1) Do you have any idea why this is happening?
2) Are there settings in php.ini or Apache needed to enable PHP FTP properly?
Thank you.
Cyril
Maybe you exceeded the maximum execution time.
Try to increase it:
http://php.net/manual/en/function.set-time-limit.php
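For example, at the top of the download script you could raise the limit like this (a minimal sketch; the 300-second value is only an illustration):
// Allow the FTP transfer to run longer than the default limit.
// A value of 0 would remove the limit entirely.
set_time_limit(300);

// Alternatively, raise it through ini_set:
ini_set('max_execution_time', '300');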
I want to zip a large folder of 50K files on Windows Server. I'm currently using this code:
include_once("CreateZipFile.inc.php");

$createZipFile = new CreateZipFile;
$directoryToZip = "repository";
$outputDir = ".";
$zipName = "CreateZipFileWithPHP.zip";

define("ZIP_DIR", 1); // 1 = zip a whole directory
if (ZIP_DIR)
{
    // Code to zip a directory and all its files/subdirectories
    $createZipFile->zipDirectory($directoryToZip, $outputDir);
} else
{
    //?
}

$fd = fopen($zipName, "wb");
$out = fwrite($fd, $createZipFile->getZippedfile());
fclose($fd);

$createZipFile->forceDownload($zipName);
#unlink($zipName);
Everything works fine up to around 2K image files, but that is not enough: I need to be able to zip at least 50K images. Beyond that, my script fails with this error:
Fatal error: Maximum execution time of 360 seconds exceeded in C:\xampp\htdocs\filemanager\CreateZipFile.inc.php on line 92
$newOffset = strlen(implode("", $this->compressedData));
I'm looking for any solution that can handle such a huge number of files. I currently use XAMPP on Windows Server 2008 Standard. Is there any way to build the zip in smaller parts, perhaps using a system command or an external tool to pack them, and then send the result to the browser for download?
http://pastebin.com/iHfT6x69 for CreateZipFile.inc.php
Try this to increase the execution time:
ini_set('max_execution_time', 500);
500 is the number of seconds; change it to whatever you like.
Do you need a smaller file or a quickly served file?
For fast serving without compression and without memory problems, you could try calling a system command with a zip tool such as gzip or zip and turning the compression off.
The files would probably get huge, but they would be served fast as one file.
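A minimal sketch of that idea, assuming the Info-ZIP zip command line tool is available on the server (on Windows it would have to be installed separately) and reusing the folder and archive names from the question:
$directoryToZip = "repository";
$zipName = "CreateZipFileWithPHP.zip";

// -0 stores the files without compression, -r recurses into subdirectories.
// escapeshellarg() guards against unexpected characters in the paths.
$cmd = 'zip -0 -r ' . escapeshellarg($zipName) . ' ' . escapeshellarg($directoryToZip);
exec($cmd, $output, $exitCode);

if ($exitCode === 0) {
    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename="' . basename($zipName) . '"');
    header('Content-Length: ' . filesize($zipName));
    readfile($zipName); // streams the archive from disk instead of holding it in memory
} else {
    echo "zip failed with exit code $exitCode";
}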
I have a PHP script that downloads files via direct links from a remote server that I own. Sometimes the files are large (~500-600 MB) and sometimes small (~50-100 MB).
Some code from the script:
$links[0]="file_1";
$links[1]="file_2";
$links[2]="file_3";
for ($i = 0; $i < count($links); $i++) {
    $file_link = download_file($links[$i]); // this function downloads the file with curl and returns the path to the downloaded file on the local server
    echo "Download complete";
    rename($file_link, "some other_path/..."); // this moves the downloaded file to some other location
    echo "Downloaded file moved";
    echo "Download complete";
}
My problem is that when I download a large file and run the script from a web browser, it takes up to 5-10 minutes to complete; the script echoes up to "Download complete" and then dies completely. I always find that the file that was being downloaded before the script died is 100% downloaded.
On the other hand, if I download small files of 50-100 MB from the web browser, or run the script from the command shell, this problem does not occur at all and the script completes fully.
I am using my own VPS for this and do not have any time limit set on the server. There is no fatal error or memory overload problem.
I also tried ssh2_sftp to copy the files from the remote server, but the same problem occurs when I run the script from the web browser. It always downloads the file, executes the next line, and then dies! Very strange!
What should I do to get over this problem?
To make sure you can download larger files, you will have to make sure that there is:
enough memory available for PHP
the maximum execution time limit set high enough.
Judging from what you said about ssh2_sftp (I assume you are running it via PHP), your problem is the second one. Check your error logs to find out whether that really is the error. If so, simply increase the maximum execution time in your settings/php.ini and that should fix it.
Note: I would encourage you not to let PHP handle these large files itself. Call an external program (via system() or exec()) that will do the download for you, as PHP still has garbage collection issues.
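A minimal sketch of that approach, assuming curl is available on the VPS; the URLs and paths below are placeholders, not the real ones from the question:
$links = array("http://example.com/file_1", "http://example.com/file_2");

foreach ($links as $url) {
    $target = '/tmp/' . basename($url);

    // Let curl do the actual transfer so PHP only waits for the exit code.
    $cmd = 'curl -sS -o ' . escapeshellarg($target) . ' ' . escapeshellarg($url);
    exec($cmd, $output, $exitCode);

    if ($exitCode === 0) {
        rename($target, 'some_other_path/' . basename($url)); // move to the final location
        echo "Downloaded and moved " . basename($url) . "\n";
    }
}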
I have got a huge XML file (~85 MB) that I would like to open with XMLReader (my script then reads selected lines). I downloaded it to my PC and my script works there (using WAMP).
Now I would like to do the same online. The server login is aaa and the password is bbb (of course these are just examples).
I tried the following statement:
$xml = new XMLReader();
if ($xml->open('ftp://aaa:bbb@ftp.website.com/myfile.xml')) {
    echo 'OK';
    while ($xml->read()) {
        // my script here...
    }
}
It seems I am doing something wrong, because my web browser tells me the page is taking too long to load. What is the right way to proceed? Or did I miss something important?
Since the file in question is XML, it is not wise to download it partially, since that would probably break the XML structure and cause the parser to fail.
You could set up a cron job to retrieve the file occasionally and open it from a local location on the server, or retrieve it once and cache it locally to speed up subsequent requests, as in the sketch below.
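A minimal sketch of the caching idea (the cache path and the one-day lifetime are assumptions, and it relies on allow_url_fopen being enabled for the ftp:// wrapper):
$remote = 'ftp://aaa:bbb@ftp.website.com/myfile.xml';
$local  = __DIR__ . '/cache/myfile.xml';

// Refresh the local copy only if it is missing or older than a day.
if (!file_exists($local) || filemtime($local) < time() - 86400) {
    copy($remote, $local);
}

$xml = new XMLReader();
if ($xml->open($local)) {
    while ($xml->read()) {
        // my script here...
    }
    $xml->close();
}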
http://davidwalsh.name/increase-php-script-execution-time-limit-ini_set
The default execution time is 30 seconds; increase it to a minute and retry.
I would use cURL to connect to the FTP server and download the file; see the sketch below.
http://php.net/manual/en/book.curl.php
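Something along these lines (a rough sketch using the credentials and file name from the question, with error handling left out):
$fp = fopen('myfile.xml', 'wb');

$ch = curl_init('ftp://ftp.website.com/myfile.xml');
curl_setopt($ch, CURLOPT_USERPWD, 'aaa:bbb');
curl_setopt($ch, CURLOPT_FILE, $fp); // stream straight to disk instead of into memory
curl_exec($ch);
curl_close($ch);
fclose($fp);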
You're probably best off downloading the file to the server first using PHP's FTP functions and then opening the file locally; I wouldn't rely on FTP support within the XML library. You'll get more flexibility this way, for example:
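A minimal sketch of that approach with the ftp_* functions, reusing the example host and file name from the question:
$conn = ftp_connect('ftp.website.com');
ftp_login($conn, 'aaa', 'bbb');
ftp_pasv($conn, true);

// Download the remote XML to a local file first.
if (ftp_get($conn, 'myfile.xml', 'myfile.xml', FTP_BINARY)) {
    $xml = new XMLReader();
    if ($xml->open('myfile.xml')) {
        while ($xml->read()) {
            // my script here...
        }
        $xml->close();
    }
}
ftp_close($conn);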
I have a site which is hosted on a dev server for demonstration to the client, and everything works without problems. However, when I download the files and database to my local EasyPHP installation, I receive the following error:
Fatal error: Maximum execution time of 30 seconds exceeded in C:\Program Files (x86)\EasyPHP-5.3.4.0\www\PC Estimating\classes\database.class.php on line 23
The database details for the connection are correct, as the Database object is already used in part of the template before this error is shown.
My question is, why does the system work fine on a live server, but not on EasyPHP?
You should check the max_execution_time setting in the php.ini files on your server and on your local installation.
By the way, what is done on line 23?
Copied from my comment to make it easier to find the solution:
Some things really run slower on Windows: on Mac/Unix, PHP connects to MySQL through a file socket, while on Windows it has to use TCP/IP. Try using "127.0.0.1" instead of "localhost" when connecting to the DB, as in the example below.
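For example, if the code on line 23 connects with mysqli, the change would look roughly like this (the credentials and database name are placeholders):
// "localhost" can make the Windows build go through a slow socket/IPv6 lookup;
// forcing TCP/IP via 127.0.0.1 often avoids the delay.
$db = new mysqli('127.0.0.1', 'user', 'password', 'pc_estimating');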
This issue has two possible solutions:
1) Increase max_execution_time in your php.ini. First, locate this file, then edit it. Find this line:
max_execution_time=30
and replace by:
max_execution_time=120
Then restart your web server.
This will increase the limit from 30 seconds to 120 seconds. You can increase it even more, according to your application's needs.
2) If this setting doesn't solve the problem, you may have to look into your PHP application, because there may be an infinite loop or something similar.
More details about this problem:
https://www.copahost.com/blog/increase-php-max-execution-time/
Because your PC is slow compared to the server, and/or your code is really badly optimised.
A very strange thing is happening. I am running a script on a new server (it works on my current server and on my laptop).
The strange thing is that I only get it to (sort of) work when I increase the memory limit to 1024M (!). It is extracting a large zip file and going through the files, so I thought that was normal. But instead of the script terminating or ending with errors, I get an error from my browser:
The server at www.localhost.com is taking too long to respond.
Localhost.com? The web server is just localhost:9090 and I can see Apache is still running. Maybe Apache crashes momentarily and the browser can't find the server? But there is nothing about Apache crashing in the log files.
This isn't a server issue; it's more to do with my PHP script and memory usage, I think, so no need to move this to Server Fault.
What could be the problem? How can I narrow down the cause? I am at a loss here!
The server is a Windows server running Apache 2.2 with PHP 5.3.2. My laptop and the other working server are running PHP 5.3.0 and 5.3.1 respectively.
Thanks all for any help.
Ensure that,
ini_set('display_errors', 'On');
ini_set('error_reporting', E_ALL);
ini_set('max_execution_time', 180);
ini_set('memory_limit', '1024M'); // note: "1024M", not "1024MB"
I'd pop this at the top of the script and see what comes out. It should show you errors and the like.
The other thing: have you checked fopen and the path of the file it's loading?
Abs said: check that the files being zipped up can be zipped by PHP (permissions, especially on a Windows OS with multiple users).
I kept getting this problem too, and none of these sites really helped until I started looking at the same issue for people using Internet Explorer. The way I fixed it was to open the system hosts file, located at C:\Windows\System32\drivers\etc\hosts, and uncomment the line that mentions ::1, which is needed for IPv6. After that it worked fine.
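For reference, the relevant entries in that hosts file typically look like this once uncommented:
127.0.0.1       localhost
::1             localhost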
Somehow your system is munged up and isn't treating localhost as the local 127.0.0.1 address. Is your hosts file properly configured? This is most likely why you're getting the "too long to respond" error:
marc@panic:~$ host www.localhost.com
www.localhost.com has address 64.99.64.32
marc@panic:~$ wget www.localhost.com
--2010-08-03 22:41:05-- http://www.localhost.com/
Resolving www.localhost.com... 64.99.64.32
Connecting to www.localhost.com|64.99.64.32|:80... connected.
HTTP request sent, awaiting response... Read error (Connection reset by peer) in headers.
Retrying.
www.localhost.com is a fully valid hostname as far as the DNS system is concerned.
I am not a PHP guru by any means, but are you writing the extracted files to a temporary local storage location that is within the scope of the application? If not, I think what is happening is that the application is storing the zip file and the extracted files in memory and then attempting to read them. So if it is a large zip and/or the extracted files are large, that would introduce a huge amount of overhead on top of the overhead introduced by your read and processing actions.
So if you are not doing this already, I would extract the files and write them to disk in their own folder, dispose of the zip file at that point, and then iterate over the files in the newly created directory and perform whatever actions you need on them.
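A rough sketch of that flow with ZipArchive (the archive name and working folder are placeholders):
$zip = new ZipArchive();
if ($zip->open('upload.zip') === true) {
    // Write everything to disk instead of holding it in memory.
    $zip->extractTo(__DIR__ . '/extracted');
    $zip->close();
}
unlink('upload.zip'); // dispose of the archive once it has been extracted

foreach (new DirectoryIterator(__DIR__ . '/extracted') as $file) {
    if ($file->isFile()) {
        // perform whatever processing is needed on each extracted file
    }
}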