I am using php scripts as cron jobs.
I have this script, which opens a CSV file, parses its content into an array, then modifies it and closes it.
I added this script to the crontab.
The problem is that when I delete that CSV file manually from the filesystem, the script still finds it and parses the old content, as if the file were somehow cached.
When I execute the script using the command line, it works properly (does not find the CSV file).
I added clearstatcache() and then opcache_reset() at the beginning of the script, but nothing works.
Thank you for your help
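For reference, a minimal sketch of the kind of guard being described, using a hypothetical path (/path/to/data.csv) in place of the real CSV location:

<?php
// Hypothetical path; substitute the real CSV location used by the cron job.
$csvPath = '/path/to/data.csv';

// Clear PHP's cached stat information for this path before checking it.
clearstatcache(true, $csvPath);

if (!file_exists($csvPath)) {
    // The file was deleted; exit instead of parsing stale data.
    exit(0);
}

// Parse the CSV into an array of rows.
$rows = array_map('str_getcsv', file($csvPath));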
I have created two PHP files that perform two separate tasks.
1. The first script uses PHPSpreadsheet to create and populate my report template, which is an xlsx file. This works perfectly when run directly through the browser (Google Chrome).
2. The second script uses PHPMailer to then pick up this file and attach it to an e-mail. Again, this works perfectly when executed through the browser.
I have tried running the same scripts using a batch file, but this is where the first section fails. The Xlsx file never gets created, and hence an e-mail gets sent without attaching the document.
I have tried running each script individually via the batch script.
The second part (sending the mail) works as expected.
Creating the file via the batch script seems to be the issue.
REM This adds the folder containing php.exe to the path
PATH=%PATH%;C:\php
REM Change Directory to the folder containing your script
CD C:\Apache24\htdocs\phpspreadsheet\reporting
REM Execute
php report_v1.1.php
REM Change Directory to the folder containing your script
CD C:\Apache24\htdocs\Automation
REM Execute
php report.php
I would like to run these two scripts sequentially so that I can automate sending out the report to the relevant recipients. I do not get any error message from the first script, only a lot of random characters on the terminal screen when I execute it via the command line to test.
Updating the last part of my code to read
$writer->save(php_sapi_name() === 'cli' ? 'report_name.xlsx' : 'php://output');
did the trick. Thanks to #Hunman for the suggested solutions.
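For context, a minimal sketch of how that line fits into a PhpSpreadsheet script; the file name, cell contents, and autoloader path are placeholders, not the original report code:

<?php
require 'vendor/autoload.php';

use PhpOffice\PhpSpreadsheet\Spreadsheet;
use PhpOffice\PhpSpreadsheet\Writer\Xlsx;

// Build a placeholder spreadsheet; the real report population goes here.
$spreadsheet = new Spreadsheet();
$spreadsheet->getActiveSheet()->setCellValue('A1', 'Report');

$writer = new Xlsx($spreadsheet);

if (php_sapi_name() !== 'cli') {
    // In the browser case, the usual download headers are sent first.
    header('Content-Type: application/vnd.openxmlformats-officedocument.spreadsheetml.sheet');
    header('Content-Disposition: attachment; filename="report_name.xlsx"');
}

// From the command line (e.g. the batch file), write a real file on disk;
// through the web server, stream the workbook to the browser instead.
$writer->save(php_sapi_name() === 'cli' ? 'report_name.xlsx' : 'php://output');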
I am running a batch file that executes my PHP file, and the PHP file creates a text file. After I terminate the batch file, the text file cannot be renamed or deleted; it shows this error: "The action cannot be completed because the file is open in php.exe".
Thank you in advance.
This error can occur if you maintain file handles throughout your application. Make sure you are closing the file. Without seeing your PHP code I can't give you a better answer than that.
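As a rough illustration, a write that releases its handle as soon as it is done might look like this (the file name is hypothetical):

<?php
// Hypothetical output file created by the script run from the batch file.
$path = 'output.txt';

$handle = fopen($path, 'a');
if ($handle === false) {
    exit(1);
}

fwrite($handle, "some line\n");

// Close the handle so Windows no longer reports the file as open in php.exe.
fclose($handle);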
I have a serious problem: when I execute my script to generate an Excel file, it shows a blank page.
This is the code:
http://pastebin.com/fb3KMYHv
The script runs on a Linux server (Hetzner is the host), and if my query returns a lot of records it doesn't work.
Thanks.
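The linked code is not reproduced here, but as a general diagnostic step a blank page usually hides a fatal error. Temporarily surfacing errors, and raising the memory limit if memory turns out to be the problem with large result sets, can show what actually fails (the value below is only an example):

<?php
// Temporary diagnostics only: surface errors instead of a blank page.
ini_set('display_errors', '1');
error_reporting(E_ALL);

// If the error turns out to be memory exhaustion on large result sets,
// the limit can be raised; 512M here is just an illustrative value.
ini_set('memory_limit', '512M');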
I am writing a Windows batch file that needs to execute a PHP file which fetches data from a backend and inserts it into a MySQL database.
Below is the code I used; it works, but it opens the browser.
#ECHO OFF
START http://localhost/test.php
How do I ensure that the browser is not invoked when START is executed? I have tried putting /B at the end, but it is not working.
I have also tried the following but it is not working at all and nothing gets inserted.
#ECHO OFF
php.exe -f "C:\wamp\www\test.php"
The simple answer is that PHP files are set up to open with your default browser; when you open the PHP file, it will open with that browser.
If you want to view the contents of the file in cmd instead, you can use
type test.php
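If the goal is to actually execute the script from the batch file rather than display it, a rough sketch of a CLI-friendly test.php (hypothetical contents, assuming mysqli with placeholder credentials and table) that makes failures visible on the console might look like this:

<?php
// Hypothetical version of test.php: placeholder credentials and table.
mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT);

try {
    $db = new mysqli('localhost', 'user', 'password', 'testdb');

    // Placeholder insert standing in for the real backend fetch + insert.
    $stmt = $db->prepare('INSERT INTO items (name) VALUES (?)');
    $name = 'example';
    $stmt->bind_param('s', $name);
    $stmt->execute();

    echo "Inserted OK\n";
} catch (Throwable $e) {
    // Write the failure to STDERR so it shows up when run via php.exe -f.
    fwrite(STDERR, 'Error: ' . $e->getMessage() . "\n");
    exit(1);
}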
A Perl script (which I do not control) appends lines to the end of a text file periodically.
I need my PHP script (which will run as a cron job) to read the lines from that file, process them, and then remove them from the file. But, it seems like the only way to remove a line from a file with PHP is to read the file into a variable, remove the one line, truncate the file, and then rewrite the file.
But what happens if:
1. PHP reads the file.
2. The Perl script appends a new line.
3. The PHP script writes the modified buffer back over the file.
In that case the new line would be lost because it would be overwritten when the PHP script finishes and updates the file.
Is there a way to lock a file using PHP in a way that Perl will respect? It looks like the flock() function is PHP specific.
Do you have any freedom to change the design? Is removing the processed lines from the file an essential part of your processing?
If you have that freedom, how about letting the Perl-produced file grow? Presumably the authors of the Perl script have some kind of housekeeping in mind already. Maintain your own "log" of what you have processed. Then, when your script starts up, it reads the Perl file up to the point recorded in your "log". Process a record, update the log.
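A rough PHP sketch of that idea, with hypothetical file names (data.log for the Perl-written file, offset.txt for the bookkeeping "log") and a placeholder processing function:

<?php
// Hypothetical paths: the Perl-written file and our own bookkeeping file.
$dataFile   = 'data.log';
$offsetFile = 'offset.txt';

// Placeholder for the real per-record processing step.
function process_line(string $line): void
{
    echo $line, "\n";
}

// Byte offset up to which we have already processed the Perl file.
$offset = is_file($offsetFile) ? (int) file_get_contents($offsetFile) : 0;

$fp = fopen($dataFile, 'r');
if ($fp === false) {
    exit(0); // Nothing to do if the file does not exist yet.
}
fseek($fp, $offset);

while (($line = fgets($fp)) !== false) {
    process_line(rtrim($line, "\n"));

    // Record progress after each line so a crash loses at most one record.
    file_put_contents($offsetFile, (string) ftell($fp));
}

fclose($fp);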
If the Perl script, which you cannot control, already implements file locking via flock, you are fine. If it doesn't (and I'm afraid that we have to assume that), you are out of luck.
Another possibility would be, instead of having the Perl script write to a file, to let it write to a named pipe and have your PHP script read directly from the other end and write the results out to a real file.
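For reference, the PHP side of cooperative locking looks roughly like this; it only helps if the Perl script takes a matching flock on the same (hypothetical) file:

<?php
// Hypothetical shared file; the Perl writer would need a matching flock call
// for this advisory lock to protect against the race described above.
$fp = fopen('data.log', 'c+');

if ($fp !== false && flock($fp, LOCK_EX)) {
    // Read everything that is currently in the file.
    $lines = [];
    while (($line = fgets($fp)) !== false) {
        $lines[] = rtrim($line, "\n");
    }

    // ... process $lines here ...

    // Empty the file while still holding the lock, then release it.
    ftruncate($fp, 0);
    fflush($fp);
    flock($fp, LOCK_UN);
}

if ($fp !== false) {
    fclose($fp);
}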
Maybe, instead of working on the same file, you could let your PHP script work on a copy? I imagine it could work with three files:
1. The file written to by the Perl script
2. A copy of file 1
3. A processed version of file 2
Then, when your PHP script starts, it checks whether file 1 is newer than file 2; if so, it makes a new copy, processes it (possibly skipping the lines already processed previously), and writes the result to file 3.
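A rough sketch of that three-file arrangement in PHP, with hypothetical names (perl_output.log, copy.log, processed.log), a placeholder processing step, and the "skip already processed lines" detail left out:

<?php
// Hypothetical file names for the three roles described above.
$file1 = 'perl_output.log';   // Written to by the Perl script.
$file2 = 'copy.log';          // Our snapshot of file 1.
$file3 = 'processed.log';     // Processed version of file 2.

// Only take a new snapshot when the Perl file has changed since last time.
if (!is_file($file2) || filemtime($file1) > filemtime($file2)) {
    copy($file1, $file2);

    $processed = [];
    foreach (file($file2, FILE_IGNORE_NEW_LINES) as $line) {
        // Placeholder processing step; the real logic would also skip
        // lines that were already handled on a previous run.
        $processed[] = strtoupper($line);
    }

    file_put_contents($file3, implode("\n", $processed) . "\n");
}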