Insert a lot of data into a MySQL database with PHP

I want to insert about 19,000 records into a MySQL database with PHP, but whether I load the page or run a cron job, only about 2,000 get inserted. How can I insert all of the data?
foreach ($inventories as $item) {
    $this->model->AddInventory($item);
}

A cron job and a page load are basically the same thing: the cron job just hits the URL for you at a defined interval.
In your case the likely cause is PHP's execution timeout (the default max_execution_time is 30 seconds). You now have two options:
increase max_execution_time in the php.ini file
execute your script via the command line
I would recommend the command line; raising max_execution_time is not the right approach.
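Whichever route you take, batching the rows also helps: one multi-row INSERT per chunk is far faster than 19,000 single-row calls, and often finishes within the timeout anyway. A minimal sketch, assuming a PDO connection and hypothetical `inventory` table with `sku` and `qty` columns (the internals of `AddInventory` are not shown in the question):

```php
<?php
// Build a multi-row INSERT statement for one chunk of items.
// Table and column names here are assumptions for illustration.
function buildChunkInsert(string $table, array $columns, int $rowCount): string
{
    $placeholderRow = '(' . implode(', ', array_fill(0, count($columns), '?')) . ')';
    $rows = implode(', ', array_fill(0, $rowCount, $placeholderRow));
    return sprintf('INSERT INTO %s (%s) VALUES %s', $table, implode(', ', $columns), $rows);
}

// Insert all items in chunks of $chunkSize rows, one query per chunk.
function insertInChunks(PDO $pdo, array $items, int $chunkSize = 500): void
{
    foreach (array_chunk($items, $chunkSize) as $chunk) {
        $sql = buildChunkInsert('inventory', ['sku', 'qty'], count($chunk));
        $params = [];
        foreach ($chunk as $item) {
            $params[] = $item['sku'];
            $params[] = $item['qty'];
        }
        $pdo->prepare($sql)->execute($params);
    }
}
```

With 500 rows per statement, 19,000 records take 38 round-trips to MySQL instead of 19,000.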

Related

How to exit script if external call exceeds time limit

PHP has a function called set_time_limit (and a config option max_execution_time) which lets you specify a time limit that triggers the script to exit when reached. However, this does not take into account time spent in external calls. That means that if (for example) the PHP script makes a database call that does something dumb and runs for hours, the set time limit will not trigger.
If I want to guarantee that my PHP script's execution time will not exceed a specific time, is there some way to do this? I'm running the script directly in PHP (rather than via Apache or similar), so I'm considering writing a bash script to monitor ps and kill the process after a certain time. It occurs to me that I can't be the first person with this problem, so is there some already-built solution available?
Note: This question is NOT about the PHP error from exceeding its max execution time. This is about the case where PHP exceeds a time limit without triggering that error. This question is NOT just about limiting the time of a PHP script; it is about limiting the time allowed to a PHP script that makes external calls.
When running PHP from the command line, the default max_execution_time is 0 (no limit).
You can try something like the following to solve your problem:
$cmd = "(sleep 10 && kill -9 " . getmypid() . ") > /dev/null &";
exec($cmd); // schedule a forced kill of this process in 10 seconds

$i = 0;
while (1) {
    echo $i++ . "\n";
    sleep(1); // simulating a query
}

Why is a cron job ignoring the timeout specified with ini_set?

I'm using https://cron-job.org/ for a cron job, but I have a problem. When running the script manually (for example, a script that gets data from a CSV file), it works; but if I run the script through that cron job, it fails, I guess because of the 30-second max_execution_time.
But the problem is that in my script I'm already using:
ini_set('max_execution_time', 300);
That should give it 5 minutes instead of 30 seconds before the cron job fails. What am I doing wrong?
Here is an image with that cron job history:
cron-job.org has some limits:
How and how long does cron-job.org visit my URLs?
cron-job.org visits your URLs at the configured dates/intervals and waits for the URL/script to finish execution. If your URL/script does not finish after 30 seconds, it will timeout and we will close the connection to prevent delays in the execution of the jobs of other users. Our system reads up to 1024 bytes of the output of your URLs/scripts. In case your script sends more data, job execution will be aborted. (Please also see the next question.)
You can read more here: FAQ.
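Since that 30-second cap is enforced by the service, a common workaround is to make each cron hit process only a bounded slice of the work and persist a resume offset, so no single request ever approaches the timeout. A minimal sketch; the row handler and where you persist the offset (file, database row, etc.) are assumptions, not from the question:

```php
<?php
// Process only as many rows as fit in a time budget per cron hit,
// returning the offset where the next hit should resume.
function runBatch(array $rows, int $startOffset, float $budgetSeconds, callable $processRow): int
{
    $deadline = microtime(true) + $budgetSeconds;
    $offset = $startOffset;
    while ($offset < count($rows) && microtime(true) < $deadline) {
        $processRow($rows[$offset]);
        $offset++;
    }
    return $offset; // store this (e.g. in a file or DB) for the next run
}
```

With a budget of, say, 20 seconds, each invocation finishes comfortably inside the service's limit, and the job completes over several scheduled hits.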

Execute php script on server every 5 minutes

I want to run a PHP script every 5 minutes that processes some simple MySQL queries to identify potential errors and, in case an error is recorded as a database entry, sends out an email.
From my research, it seems like cron jobs (or task schedulers) usually take care of running scripts at specified times; however, I cannot find this option anywhere at the hosting service I am using (which runs "Parallels Plesk Panel 11.0.9" as the management interface I can access).
Therefore I tried the following "trick":
<?php
$active = $_GET["a"];
set_time_limit(0);

while ($active == 1) {
    include 'alert_exe.php';
    sleep(300); // execute the script every 5 minutes
}
?>
To activate the script I enter the URL (".../alert.php?a=1"). This works fine for a couple of minutes, but after 2 or 3 minutes the script stops executing.
Any idea how I can prevent the script from stopping, or alternative suggestions for how to achieve the automatic execution of a script every 5 minutes (without being able to access cron jobs)?
Thanks!
It would not surprise me if a hosting service protected its servers from being overloaded by long-running scripts and configured a time-out after which such scripts are aborted (see PHP: Runtime Configuration Settings, and max_execution_time in particular).
If you have a PC that can stay turned on, an alternative solution would be to let it send a request to the server every 5 minutes:
<?php
// Your PHP code that has to be executed every 5 minutes comes here
?>
<script>
setTimeout(function () { window.location.reload(); }, 5 * 60 * 1000);
// Just show the current timestamp to see the time of the last refresh.
document.write(new Date());
</script>
There is a max_execution_time parameter which stops the script if it takes too long, 30 seconds by default; see the docs: http://php.net/manual/en/info.configuration.php#ini.max-execution-time.
You can try calling set_time_limit(0) at the beginning of your script (see http://php.net/manual/en/function.set-time-limit.php), or change the max_execution_time parameter itself.
But in general I would not go with such a solution; it is not very reliable. Better to find a hosting provider where you can use cron, or look for some external service that will ping your script every 5 minutes (you can probably use services that monitor web application health).
Try this solution:
<?php
$interval = 5; // minutes
set_time_limit(0);

while (true) {
    $now = time();
    include 'alert_exe.php';
    sleep($interval * 60 - (time() - $now)); // subtract the time the included script took
}
?>
Stop the script by restarting Apache, or build in a return value from your internal script which changes while (true) to while (false).
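A variation on the "return value" idea that avoids restarting Apache: have the loop check for a stop file on each iteration and exit cleanly when it appears. A minimal sketch; the stop-file path is an assumption:

```php
<?php
// Return false (and consume the signal) once the stop file exists,
// so the polling loop below exits cleanly on its next iteration.
function shouldKeepRunning(string $stopFile): bool
{
    if (file_exists($stopFile)) {
        unlink($stopFile); // consume the signal so the next start works
        return false;
    }
    return true;
}
```

Usage in the loop above: `while (shouldKeepRunning('/tmp/alert.stop')) { include 'alert_exe.php'; sleep(300); }` — then `touch /tmp/alert.stop` on the server stops it within one cycle.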

"Maximum execution time exceeded" when processing my queue

I have made a queue using MySQL and PHP. The PHP script first retrieves all the tasks to be done from the database and then executes them one by one in a loop. But since there are many tasks and each task requires a lot of time, the result is a 'Maximum execution time exceeded' error.
How can I fix this? Please don't suggest editing php.ini. I tested this in my browser, but the PHP script will be invoked using cron.
You may put this piece of code at the top of your PHP script:
ini_set('max_execution_time', 0);
Or create a .htaccess file in the same folder with this line in it:
php_value max_execution_time 0

php/apache: allow larger max_execution_time for a CRON job?

I want to run a PHP script weekly using a cron job; however, the script may take a few minutes or more.
Is there any way I can allow a greater max_execution_time just for this script?
You don't need to set a higher max_execution_time if you use PHP CLI: http://nl3.php.net/manual/en/features.commandline.differences.php
Maybe you should try these answers:
How do you get a Cronjob executing a PHP script to run longer than 30 seconds.
PHP command line: max_execution_time and memory_limit
But of course, using ini_set("max_execution_time", 60) as the first PHP line in your job's script should do the trick.
Regards, Daniel
You can use set_time_limit(). If you want to disable the timeout entirely, pass it 0 as an argument; otherwise pass it the maximum execution time in seconds.
You can use set_time_limit(0) at the start of your code: this removes the execution time limit altogether for this script. Note that it means the script could run "forever", so put some checks in place in case it hangs.
set_time_limit
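Once the built-in limit is disabled with set_time_limit(0), nothing stops a runaway script, so one way to add the checks mentioned above is to enforce your own deadline inside the queue loop and let the next cron run pick up the remainder. A minimal sketch; the task array and handler callback are placeholders for whatever your queue actually stores:

```php
<?php
// Disable PHP's own limit, then enforce a self-imposed wall-clock deadline.
set_time_limit(0);

// Process tasks until done or until $maxSeconds have elapsed;
// returns how many tasks were completed this run.
function processQueue(array $tasks, int $maxSeconds, callable $handler): int
{
    $deadline = time() + $maxSeconds;
    $done = 0;
    foreach ($tasks as $task) {
        if (time() >= $deadline) {
            break; // resume the remaining tasks on the next cron run
        }
        $handler($task);
        $done++;
    }
    return $done;
}
```

Marking tasks as done in the database as you go (rather than only at the end) makes this resumable across runs.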
