Script stops when error.log rolls over - php

I have the PHP Drupal framework running on an Apache server. One REST call sent to my Drupal site triggers a block of PHP code. This block of code keeps running for 3 days in the backend, analysing data stored in a local database and writing the analysis results to a text file.
Note that this block of code also runs under the Drupal framework; it is not running independently. When the Apache error.log rolls over, the block of code stops working, and to continue the task I have to send the REST call again.
When error.log rolls over, the log content is moved to error.log.1 and a new error.log file is created. My code keeps writing to the log, and I can't see anything abnormal at the end of error.log.1 - the script just stops.
That's a lot of detail, but I wanted to explain the problem clearly.
Question: how should I handle this? Should I change some configuration of the Apache server, or do I have to use an independent script that doesn't depend on Apache?

I found this page explaining how Apache and the system perform log rotation. It shows the files I need to look at if I want to resolve my issue.
https://www.digitalocean.com/community/tutorials/how-to-configure-logging-and-log-rotation-in-apache-on-an-ubuntu-vps
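If moving the 3-day job out of the Apache request is an option, one common pattern is to have the REST endpoint only *start* a detached CLI process, so a log rollover or graceful restart of httpd no longer kills the work. This is a minimal sketch, not Drupal-specific; the script path is hypothetical:

```php
<?php
// Sketch: instead of running the 3-day analysis inside the Apache
// request, have the REST endpoint spawn a detached CLI worker.
// The worker script path below is a hypothetical example.

function start_background_analysis(string $script): int
{
    // nohup + & detaches the worker from the Apache child process,
    // so rotating error.log or restarting httpd won't stop it.
    $cmd = sprintf(
        'nohup php %s > /tmp/analysis.out 2>&1 & echo $!',
        escapeshellarg($script)
    );
    return (int) shell_exec($cmd);  // PID of the detached worker
}

// Example (hypothetical path) - the REST endpoint would do:
// $pid = start_background_analysis('/var/www/scripts/analyze.php');
```

The endpoint can then return immediately with the worker's PID, and the analysis survives anything that happens to the Apache process that spawned it.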

Related

Bash grep php error_log Fatal errors to file by filter

I have a web server with several PHP web applications running. The configuration is set up so that the php error_log is deleted at midnight (so the file only ever covers 24 hours).
The thing is, I would like to log all fatal errors to another file or a database, but only for specific web applications (there are about 20 running, 4 of them mine).
I was thinking about creating a bash script that greps the error_log for "Fatal" and the URLs of my applications, writes the output to a file, and remembers the last line number of the current error_log in a separate cache file.
I would then put the script in cron and execute it every few minutes (starting at the last line of the previous run).
The whole idea is a little messy and I think it could be done more efficiently - any ideas?
Writing a cron job seems OK if you can't configure this out of the box; I don't know PHP well enough to say. In Java, for example, you can have the same log message go to several log files depending on criteria.
But I would have your cron job do both the collecting of the fatal errors AND the deletion of the previous day's log file. That way a single run of the script, at midnight, suffices, and you save yourself the complexity of tracking where you ended last time (and the chance of missing errors that happened right before midnight). If the collection succeeded, delete the old file; otherwise leave it for diagnosis and retry. It also saves you a bunch (24*60) of invocations of the script.
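The single midnight job described above could be sketched as a small PHP CLI script (paths and application hostnames here are hypothetical): collect the matching "Fatal" lines, and only clear the log once the collection has succeeded.

```php
<?php
// Sketch of the midnight cron job: harvest "Fatal" lines for your own
// applications, then clear the log only if harvesting succeeded.
// Paths and hostnames are hypothetical examples.

$log     = '/var/log/php/error_log';
$archive = '/var/log/php/my_fatals.log';
$myApps  = ['shop.example.com', 'blog.example.com'];

function collect_fatals(string $log, array $apps): array
{
    $hits = [];
    foreach (file($log, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
        if (stripos($line, 'Fatal') === false) {
            continue;                       // not a fatal error line
        }
        foreach ($apps as $app) {
            if (strpos($line, $app) !== false) {
                $hits[] = $line;            // fatal error in one of my apps
                break;
            }
        }
    }
    return $hits;
}

if (is_readable($log)) {
    $fatals = collect_fatals($log, $myApps);
    $out = $fatals ? implode("\n", $fatals) . "\n" : '';
    if (file_put_contents($archive, $out, FILE_APPEND) !== false) {
        file_put_contents($log, '');        // safe to drop yesterday's log now
    }
}
```

A crontab entry like `0 0 * * * php /path/to/collect_fatals.php` would then replace the plain midnight deletion.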

PHP created infinite loop by accident?

Background info - I created an online shop a while ago, dropshipping products. I built the website and added all the product info by hand. Now that I know PHP, I have created a scraper/spider to get all the required info without doing anything by hand.
Question - My script runs on my local server, collecting all the links from the site's sitemap.xml and uploading them to my database. Once that is complete, it starts going through the links, extracting the data needed (picture, price, name, description, etc.). The site I am scraping is not happy that I am doing it, due to human/computer errors that can only be spotted by a human, but has allowed it. Anyway, my script sometimes throws an error when an item cannot be scraped for some unknown reason, so I have put a die() where the script hits this error.
This is placed inside the MySQL while loop over the links. I have noticed a few times that when an error does occur, the page stops loading and shows me the exact error, but when I shut down the browser the script carries on deleting queries and extracting information; I have to manually restart the server before it stops.
How is this possible, and what can I do to prevent it? Does the die() just kill the client-side output while the server-side script keeps running?
So you are running PHP locally to gather data from a remote site: you start a PHP script in your local browser, and the script does not stop when the browser is closed.
One option is of course to stop the local server.
However, PHP can also be run from the command line; output then goes to the terminal, and the process can simply be killed.
Another solution: in the loop, check for the (non-)existence of a file and die() accordingly. A second PHP script, callable in a second browser tab, adds or removes that signal file.
(The file can serve as a lock too, so you don't start the data gathering twice.)
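The signal-file idea above could look like this minimal sketch (the flag path is a hypothetical example); a second script simply deletes the flag to stop the scraper:

```php
<?php
// Sketch of the signal-file pattern: the scraping loop keeps running
// only while a flag file exists. A second script, opened in another
// browser tab, deletes the file to stop it. Flag path is hypothetical.

$flag = '/tmp/scraper.running';

function should_continue(string $flag): bool
{
    clearstatcache(true, $flag);   // don't let PHP cache the stat result
    return file_exists($flag);
}

if (file_exists($flag)) {
    exit("already running\n");     // the flag doubles as a crude lock
}
touch($flag);

while (should_continue($flag)) {
    // ... fetch the next link from the database and scrape it ...
    break;                         // placeholder so this sketch terminates
}

unlink($flag);                     // release the "lock" on normal exit
```

The companion "stop" script is then just `unlink('/tmp/scraper.running');`.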

PHP Script Crashes and Max Execution Time

I currently have a website that has twice been suspended by my hosting provider for "overusing system resources". In each case, there were 300 - 400 crashed copies of one of my PHP scripts left running on the server.
The scripts themselves pull an image from a web camera at home and copy it to the server. They use file locks to ensure only one can write at a time. The scripts are called every 3 seconds by any client viewing the page.
Initially I was confused, as I had understood that a PHP script either completes (returning the result) or crashes (returning the internal server error page). I am, however, informed that "defunct scripts" are a very common occurrence.
Would anyone be able to educate me? I have Googled this to death but I cannot see how a script can end up in a crashed state. Would it not time out when it reaches the max execution time?
My hosting provider runs PHP set up as CGI on a Linux platform. I believe I have actually identified the problem with my script: I did not realise that flock() is a blocking function (and I am not using the LOCK_NB flag). I assume that somehow hundreds of copies of my script end up blocked waiting for the lock to become available, and this leads to the crash? Does that sound plausible? I am reluctant to re-enable the site for fear of it being suspended again.
Any insights greatly appreciated.
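For reference, the non-blocking variant the asker mentions looks like this: with LOCK_NB, flock() returns immediately instead of queueing hundreds of CGI processes behind the lock. A minimal sketch (the lock file name is a hypothetical example):

```php
<?php
// Sketch: non-blocking flock(). Instead of every 3-second poll piling
// up behind LOCK_EX, a request that loses the race just gives up.
// The lock file path is a hypothetical example.

$fp = fopen('/tmp/webcam.jpg.lock', 'c');

if (flock($fp, LOCK_EX | LOCK_NB)) {
    // We own the lock: copy the new webcam frame into place here, e.g.
    // copy($tmpUpload, '/var/www/webcam.jpg');
    flock($fp, LOCK_UN);
} else {
    // Another request is already writing; bail out instead of blocking,
    // so polling clients cannot accumulate hundreds of stuck processes.
    http_response_code(503);
}
fclose($fp);
```

The key difference: LOCK_EX alone sleeps until the lock is free, while LOCK_EX | LOCK_NB returns false immediately, letting the script exit cleanly.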
The approach I would probably recommend is to use tempnam() first and write the contents there (which may take a while). Once done, do the file locking, renaming, etc.
I'm not sure this applies when a PUT request is being done; typically PHP handles file uploads first, before handing execution over to your script.
The script could also crash on either of these two limits while working with resources:
max_execution_time
memory_limit
That is assuming there are no other errors in the script - check for notice-level errors too.

How do I reload files in PHP while the program is still running?

I'm helping develop a PHP IRC bot and I'd like to know how I could reload the bot's config via an IRC command. Could someone give a basic idea of how to do this? I was thinking the bot could re-require_once the config file and then restart, but I don't know how to do that (when it runs die(), it stops the entire program, so it can't revive itself).
Your PHP program will likely be event-based. You'll be listening on a port. You'll have a point in your code that will be called whenever someone writes something in the channel.
At this exact point, you can inject the logic that loads the configuration from disk.
You can also have a wrapper around this logic that serves as a time-based cache: it doesn't reload the config every time a post is made - and instead looks it up in memory.
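The time-based cache described above could be sketched like this (the config path and TTL are hypothetical examples); the bot's message handler calls this on every event, but the disk is touched at most once per TTL:

```php
<?php
// Sketch of a time-based config cache: reload the bot's config from
// disk at most once every $ttl seconds, otherwise serve the in-memory
// copy. The config path and format are hypothetical examples.

function get_config(string $path, int $ttl = 60): array
{
    static $cache = null;     // in-memory copy of the config
    static $loadedAt = 0;     // unix time of the last disk read

    if ($cache === null || time() - $loadedAt >= $ttl) {
        $cache = parse_ini_file($path) ?: [];  // or include a PHP array file
        $loadedAt = time();
    }
    return $cache;
}

// In the message handler (hypothetical path):
// $config = get_config('/etc/ircbot/bot.ini');
```

An IRC "reload" command can bypass the cache by resetting the timestamp, or simply by checking the file's mtime with filemtime() instead of a fixed TTL.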

Does PHP run in background when browser is closed?

If I start my browser and run a PHP program (on another server) and then close the browser, the program will still keep running on the server, right?
What if you run the program and then remove its folder on the server (while the program is running)? Assuming it's a single PHP file, will it crash? Is the whole PHP file read into memory before running, or does the system access the file periodically?
First off, when the server receives a request, it will continue to process that request until it finishes its response, even if the browser that made the request closes.
The PHP file is loaded into memory and processed, so deleting the file in the middle of processing will not cause anything to crash.
If, however, halfway through your PHP it references another file that is deleted before that code is reached, then it may crash (depending on your error handling).
Note, however, that causing PHP to crash will not crash the whole web server.
According to the PHP Connection Handling Page:
http://php.net/manual/en/features.connection-handling.php
You can decide whether or not you want a client disconnect to cause
your script to be aborted. Sometimes it is handy to always have your
scripts run to completion even if there is no remote browser receiving
the output.
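The behaviour the manual page describes is controlled with ignore_user_abort(); a minimal sketch (the work function is a hypothetical placeholder):

```php
<?php
// Sketch: keep running to completion even after the browser disconnects.
// ignore_user_abort() and connection_aborted() are real PHP functions;
// do_next_chunk() is a hypothetical placeholder for the actual work.

ignore_user_abort(true);   // don't abort the script on client disconnect
set_time_limit(0);         // and don't stop at max_execution_time

while (do_next_chunk()) {
    if (connection_aborted()) {
        // We chose to ignore the abort, but we can still detect it here,
        // e.g. to stop sending output and just finish the job quietly.
        // (PHP only notices the abort when it tries to send output.)
    }
}

function do_next_chunk(): bool
{
    // Placeholder: do one unit of work, return false when done.
    return false;
}
```

Without ignore_user_abort(true), the default behaviour depends on the ignore_user_abort ini setting, and the script is normally aborted the next time it tries to write output to a disconnected client.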
Of course you can delete the file, or the folder containing the PHP file, as long as it is not directly in use/open on the server.
Otherwise you could never delete files on a web server, as they might always be in use :-)
