I have a script that creates a file using fwrite and, right after creating it, sends an email with this file attached.
When I run this script I get the error "File does not exist". When I run the script again, it all works.
So I guess it tries to send the file too fast after creating it, and maybe the server needs a few more milliseconds before it is able to send it.
Does anyone recognize this problem? Any solutions?
Delays should not be necessary; it is possible your script does not close the file. PHP and other scripting languages will auto-close open file handles at the end of execution, which is why the file exists for the second run.
Try sleep(1) before attaching the file. This makes PHP wait one second before trying to access the file, which should be plenty of time for it to be created, and it will tell you whether file-creation speed is the problem or not.
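Before reaching for delays, it can also help to close and flush the handle explicitly before the mail step. A minimal sketch, where the path, the data, and the send_mail_with_attachment() helper are all hypothetical:

<?php
// Build the attachment, then close the handle before anything tries to send it.
$path = '/tmp/report.txt';                 // hypothetical path
$data = "report contents\n";               // whatever fwrite() was writing

$fp = fopen($path, 'w');
fwrite($fp, $data);
fflush($fp);                               // push PHP's buffer to the OS
fclose($fp);                               // release the handle; the file is now complete on disk

// Only now hand the file to the mailer (hypothetical helper).
send_mail_with_attachment('user@example.com', 'Your report', $path);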
I'm running a script which reads a lot of data from the database. The script itself gets triggered by another script (to ensure it will not run two or more times at once). Unfortunately I keep running into a server timeout. I tried to solve it with
set_time_limit(0);
and
ini_set('max_execution_time', 0);
but that didn't do the trick.
What the script does is generate a CSV file, which should then be downloaded by the user (via browser).
The file itself has to be created just in time, so I cannot create it overnight and push it or anything like that.
Is there something like a best practice?
Since I cannot change the generating script itself, how can I ensure the file gets generated and the user gets informed that the file is ready for download?
(I can't use mail.)
Thank you so much in advance.
I have a PHP script with an infinite while loop; it must run 24/7.
From another PHP file, how can I check whether that script is running on the server or has stopped?
How can I send a signal to Apache to stop and re-execute that file?
I'm going to number the files: the script that runs 24/7 will be the first file, and the script that changes the state of the first one will be the second file.
Now, the first file can write to a file or to a database, say, every 10 minutes. That way you know whether it is running by checking that file and the last time it was written. Then you create a second file or database table and write into it the state you want the first file to be in, for example: active or disabled. The first file reads that file/table, and if the state is disabled, it stops executing.
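A rough sketch of that heartbeat/state idea; the paths, the do_work() helper, and the intervals are all placeholders:

<?php
// --- first (24/7) script: heartbeat plus state check inside its main loop ---
$heartbeat = '/var/run/worker.heartbeat';    // hypothetical paths
$stateFile = '/var/run/worker.state';

while (true) {
    file_put_contents($heartbeat, time());   // "I'm alive", written every cycle

    // obey whatever state the second script last wrote
    if (trim((string) @file_get_contents($stateFile)) === 'disabled') {
        break;                               // stop cleanly
    }

    do_work();                               // hypothetical work unit
    sleep(600);                              // roughly every 10 minutes
}

// --- second (controlling) script ---
$lastBeat  = (int) @file_get_contents('/var/run/worker.heartbeat');
$isRunning = (time() - $lastBeat) < 1200;    // no beat for 20 minutes => assume it stopped
file_put_contents('/var/run/worker.state', 'disabled');   // ask the first script to stop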
The easiest approach might be to save a timestamped status in memcache or another shared location, and have the other PHP script check the status timestamp.
It's easy to kill an Apache process and hit the page again; that will restart the script. Or you can add a signal handler to restart on SIGHUP.
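If the script runs from the CLI with the pcntl extension available, a SIGHUP handler could look roughly like this (do_work() is a placeholder):

<?php
// Requires the pcntl extension, so this only applies to a CLI-run script.
declare(ticks=1);

$restart = false;
pcntl_signal(SIGHUP, function () use (&$restart) {
    $restart = true;                        // set when `kill -HUP <pid>` arrives
});

while (true) {
    do_work();                              // hypothetical work unit

    if ($restart) {
        // replace the current process with a fresh copy of this script
        pcntl_exec(PHP_BINARY, [__FILE__]);
        exit(1);                            // only reached if the exec fails
    }
}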
You cannot check whether a specific file is running. You'll have to check whether a process is still running with that file. That also means this isn't something Apache can do for you. You'll either have to:
1) use an OS-dependent kill signal to the process running the script
2) check for a kill-signal from inside the script
The former requires a lot of privileges on the server, so it's probably easier to do the second.
The easiest way is for the running file to write to a database or file somewhere to signify that it's still going, and to read the same location for a stop signal every so often. If the running process sees a stop signal, it can simply break out of whatever loop is keeping it going.
The second script can set the signal at whatever point and for whatever reason, and the other script will terminate shortly after.
If the first script terminates, have it write a file to disk called error.txt, and make the second script check for this every minute or so.
Once the second script spots an error.txt file, that is its signal to restart your first script. A more real-time approach would be to use a database together with the current timestamp.
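One way that could be sketched; the paths and the relaunch command are assumptions:

<?php
// --- first (24/7) script: leave a marker whenever it stops ---
register_shutdown_function(function () {
    // runs on normal termination and on fatal errors
    file_put_contents('/tmp/error.txt', date('c') . " worker stopped\n");
});

// --- second script, run from cron every minute ---
if (file_exists('/tmp/error.txt')) {
    unlink('/tmp/error.txt');               // clear the marker
    // relaunch the worker in the background (command and path are assumptions)
    exec('php /path/to/worker.php > /dev/null 2>&1 &');
}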
What is the best way to generate an Excel file and inform the user afterwards?
I am using PHPExcel to generate an Excel file from an MSSQL Server on a web server and allow a user to download it using a link. The problem is that each time we try to execute the PHP script it throws a FastCGI timeout error. The script needs to read 2000 - 5000 rows of data.
We tried to execute it via command prompt using exec() and a shell. It successfully generates the file on the server, but we don't have a way of informing the user after the script is completed.
exec() should return the result of running the external program; can't you use that? You can move the generated file to a directory that is reachable by the user and just give them the URL to the file.
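A rough sketch of that approach; the generator script, output path, and download URL are assumptions:

<?php
// Run the generator and use its exit code to decide what to tell the user.
exec('php /path/to/generate_excel.php 2>&1', $output, $exitCode);

if ($exitCode === 0) {
    // move the result somewhere the web server can serve it
    rename('/tmp/report.xlsx', '/var/www/html/downloads/report.xlsx');
    echo '<a href="/downloads/report.xlsx">Your report is ready, download it here</a>';
} else {
    echo 'Generation failed: ' . htmlspecialchars(implode("\n", $output));
}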
I have a command line PHP script that runs constantly (infinite loop) on my server in a 'screen' session. The PHP script outputs various lines of data using echo.
What I would like to do is create a PHP web script to interface the command line script so that I can view the echo output without having to SSH into the server.
I had considered writing/piping all of the echo statements to a text file, and then having the web script read the text file. The problem here is that the text file will grow to several megabytes in the space of only a few minutes.
Does anyone know of a more elegant solution?
I think expect_popen will work for you, if you have it available.
Another option is to use named pipes: no disk usage, and the reading end sees output as soon as it is written.
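A named-pipe sketch; the path is a placeholder and posix_mkfifo() requires a Unix-like host:

<?php
// Web script: read from a FIFO so nothing accumulates on disk.
$pipe = '/tmp/worker.pipe';                 // hypothetical FIFO path

if (!file_exists($pipe)) {
    posix_mkfifo($pipe, 0644);              // create the FIFO once
}

$fp = fopen($pipe, 'r');                    // blocks until the CLI script opens its end
while (($line = fgets($fp)) !== false) {
    echo htmlspecialchars($line) . "<br>\n";
    flush();                                // push each line to the browser as it arrives
}
fclose($fp);

// CLI script: just write its echo output to the same path, e.g.
// file_put_contents('/tmp/worker.pipe', $message, FILE_APPEND);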
The CLI script can write to a file like so:
file_put_contents( '/var/log/cli-log-'.date('YmdHi').'.log', $data );
That way a new log file is created every minute to keep the file size down. You can then clean up the directory, deleting previous log files or moving them or whatever you want to do.
Then the web script can read from the current log file like so:
$log = file_get_contents( '/var/log/cli-log-'.date('YmdHi').'.log' );
As Elias Van Ootegem suggested, I would definitely recommend a cron job instead of a constantly running script.
If you want to view the data from a web script you can do a few things. One is to write the data to a log file or a database so you can pull it out later. I would also consider limiting what you output, since there is so much data (if that is a possibility).
I have a lot of cron jobs email me data; not sure if that would work for you, but I figured I would mention it.
The most elegant suggestion I can think of is to run the commands using exec() in a web script, which will output directly to the browser if you use: http://php.net/manual/en/function.flush.php
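Roughly, using popen() so each line can be flushed to the browser as it is produced (the command is a placeholder):

<?php
// Stream a command's output straight to the browser as it is produced.
$handle = popen('php /path/to/worker.php 2>&1', 'r');

while (($line = fgets($handle)) !== false) {
    echo htmlspecialchars($line) . "<br>\n";
    flush();                                // see php.net/manual/en/function.flush.php
}
pclose($handle);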
I am trying to stop my cron script from running in parallel. I need it so that if there is no current execution of it, the script will be allowed to run until it is complete, the script times out, or an exception occurs.
I have been trying to use the PHP flock function to engage a file lock, run the script and then release the lock. However, it still looks like I am able to run the script multiple times in parallel. Am I missing something?
By the way, I am developing on Mac OS X with the Mac filesystem; maybe this is the reason the file locks are being ignored? Though the PHP documentation only seems to talk about NTFS filesystems.
// Construct cron lock file path
$cronLockFilePath = realpath(APPLICATION_PATH . '/locks/cron');
// Get cron lock file
$cronLockFile = fopen($cronLockFilePath, 'r');
// Lock cron lock file
if (flock($cronLockFile, LOCK_EX)) {
echo 'lock';
sleep(10);
} else {
echo 'no lock';
}
Your idea is basically correct, but tinkering with file locks generally leads to strange behaviour.
Just create a file on script start and delete it at the end. The presence of the file will indicate whether the cron is already running. Make absolutely sure that the file is deleted at the end, even if the cron runs into an error halfway through.
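A sketch of that pattern with the cleanup guaranteed via a shutdown function; the lock path and run_cron_job() are assumptions:

<?php
$lock = APPLICATION_PATH . '/locks/cron.lock';   // hypothetical lock path

if (file_exists($lock)) {
    exit("Cron is already running\n");           // another instance holds the lock
}

file_put_contents($lock, getmypid());

// Remove the lock even if the script dies with a fatal error part-way through.
register_shutdown_function(function () use ($lock) {
    @unlink($lock);
});

run_cron_job();                                  // hypothetical job body

One caveat: if the machine itself crashes, the lock file is left behind, so storing the PID and checking whether that process still exists can help detect a stale lock.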
From the documentation:
Warning: On some operating systems flock() is implemented at the process level. When using a multithreaded server API like ISAPI you may not be able to rely on flock() to protect files against other PHP scripts running in parallel threads of the same server instance!
You can try to create and delete a file, or write something into it.
I think what you could do is write a regular file somewhere (lock.txt or something) when the script starts executing, without any flocks, and remove it when the script stops running. Then always check on initialization whether that file already exists; if it does, another instance is running.