run php script without output to browser

I have a VERY labor-intensive PHP script which makes several API calls to a remote server.
I need to run this script to keep certain data on my server synchronized with data on the remote server.
I want this script to start every time a specific type of user visits a specific page.
My problem is that when a qualified user goes to this page, the page load time is ridiculous, even though the data the script processes doesn't affect the page itself in any way.
So what I was wondering is: how can I run this script under the same conditions, but only on my server?
In other words, how can I run this script without making the browser wait for its output?
EDIT: useful information: Using XAMPP for Windows, PHP 5.5, Apache 2.4.
EDIT 2: Using curl seems to be the best option, but it doesn't want to actually run my script.
Here's the call:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://localhost/tool/follow/PriceTableUpdate.php');
curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 1); // abort after 1 ms so this page doesn't wait for the sync
curl_exec($ch);
curl_close($ch);
And here is the actual PriceTableUpdate.php:
<?php
ini_set('max_execution_time', 3600);

// Because this file is now requested via curl, it no longer inherits
// variables from the calling page, so the database connection has to be
// set up here again (credentials below are placeholders).
$con = mysqli_connect('localhost', 'user', 'password', 'database');

// Note the '&' separating the two query parameters.
$marketData = simplexml_load_file("http://www.eve-markets.net/api/xml?listid=20158&key=JKgLjgYvlY6nP");

foreach ($marketData->marketList->type as $type) {
    $ItemID = (int)$type['id'];
    $ItemBuy = $type->buy->price;
    $ItemSell = $type->sell->price;
    $ItemMedian = $type->median->price;
    mysqli_query($con, "UPDATE piprices SET `ItemBuyPrice` = $ItemBuy, `ItemSellPrice` = $ItemSell, `ItemMedianPrice` = $ItemMedian WHERE `piprices`.`ItemID` = $ItemID");
}
?>
EDIT 3:
Using the above DOES work, in case anyone ever wants to ask this question again.
You have to remember though, that since you are using curl, the php file no longer uses variables you've set before, so you will need to define your database connection in the php file again.

Why not use AJAX for this? When the page loads and meets your specific conditions, make an AJAX request to the server and start the script, without waiting for a response back in the browser.

You can probably make a separate call to your PHP script with the onLoad event - that is, wait until the page is loaded, then call the other script "in the background". The latter can be done with the following lines of code (I found this by following a link http://www.paul-norman.co.uk/2009/06/asynchronous-curl-requests/ posted by @Gavriel in a comment to one of the other answers):
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://www.yoursite.com/background-script.php');
curl_setopt($ch, CURLOPT_FRESH_CONNECT, true); // force a new connection
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 1);       // give up after 1 ms; the request goes out, but we never wait for the reply
curl_exec($ch);
curl_close($ch);
?>
Adding this code anywhere in your page should cause the script to be executed without delaying the page load - you won't even need the onLoad event in that case.

If I understand what you want to do, one possible solution is to run the other PHP script in a separate request. In PHP you can do this by calling it via curl: http://php.net/curl

You should separate the browser request from the background data request.
When the user accesses the page, create an item in a message-queue server and put the required data in that message. The queue can then be processed from the same machine or a different one.
If you don't split the request, your web server worker process stays alive until PHP has fully executed your script - that is to say, it blocks the browser.
You can flush() the current output, but the server still keeps the browser connection open until PHP is finished.

Try the exec() function.
If your server runs a unix-like OS, terminating your command with & will launch the command without waiting for it to end. Note that the output also has to be redirected, or exec() will still wait:
exec('./mylongcommand > /dev/null 2>&1 &');
Never tried this, but it should work...

Have a long-running background process that processes jobs from a queue; something similar to beanstalkd.
When this process comes across a job named, for example, 'sync.schizz', it will start your sync. Now you just need to pop a job into the queue when your special visitor swings by - which will be lightning fast.
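For illustration, here is a minimal producer sketch using the Pheanstalk client library - an assumption on my part, any beanstalkd client would do (assumes beanstalkd running on localhost and Pheanstalk installed via Composer):
<?php
require 'vendor/autoload.php';

use Pheanstalk\Pheanstalk;

// Connect to the local beanstalkd daemon (default port 11300).
$pheanstalk = Pheanstalk::create('127.0.0.1');

// Queue the 'sync.schizz' job; this returns immediately, so the special
// visitor's page load stays lightning fast.
$pheanstalk->useTube('sync.schizz');
$pheanstalk->put(json_encode(['user_id' => 42]));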

There are a couple of ways. One, you can fork the execution:
<?php
# only works if the pcntl extension is available (POSIX systems);
# will NOT work on Windows
# pcntl_fork() returns the new child's pid in the parent, 0 in the child
$pid = pcntl_fork();
if ($pid == -1) {
    die("could not fork\n");
} else if ($pid) {
    $my_pid = getmypid();
    print " I am the parent. pid = $my_pid \n";
    pcntl_wait($status); // protect against zombie children
} else { # child: pcntl_fork() returned 0
    $my_pid = getmypid();
    print " I am the child. pid = $my_pid \n";
}
# example output
# I am the parent. pid = 23412
# I am the child. pid = 23414
Or, fork the process at the OS level (append & to the command, assuming you are running PHP on Linux/Unix); PHP can then be used to execute a shell script.
Also, some people have suggested Ajax, though you need to be careful to consider the case where multiple scripts can be fired off at the same time. What will this do to the load on the server, and to resource locking? There probably also needs to be some locking logic to ensure only one script is executing at a time, along with a 'heartbeat' that lets you know whether the process is alive or dead.

Using the curl approach displayed in the OP, I got it to work.
I had forgotten to add the MySQL connection details, which I didn't need before (when using include).

Related

PHP cURL; Wait for API status change before continuing [duplicate]

I work on a somewhat large web application, and the backend is mostly in PHP. There are several places in the code where I need to complete some task, but I don't want to make the user wait for the result. For example, when creating a new account, I need to send them a welcome email. But when they hit the 'Finish Registration' button, I don't want to make them wait until the email is actually sent, I just want to start the process, and return a message to the user right away.
Up until now, in some places I've been using what feels like a hack with exec(). Basically doing things like:
exec("doTask.php $arg1 $arg2 $arg3 >/dev/null 2>&1 &");
Which appears to work, but I'm wondering if there's a better way. I'm considering writing a system which queues up tasks in a MySQL table, and a separate long-running PHP script that queries that table once a second, and executes any new tasks it finds. This would also have the advantage of letting me split the tasks among several worker machines in the future if I needed to.
Am I re-inventing the wheel? Is there a better solution than the exec() hack or the MySQL queue?
I've used the queuing approach, and it works well as you can defer that processing until your server load is idle, letting you manage your load quite effectively if you can partition off "tasks which aren't urgent" easily.
Rolling your own isn't too tricky, here's a few other options to check out:
Gearman - this answer was written in 2009, and since then Gearman has become a popular option; see the comments below.
ActiveMQ if you want a full blown open source message queue.
ZeroMQ - this is a pretty cool socket library which makes it easy to write distributed code without having to worry too much about the socket programming itself. You could use it for message queuing on a single host - you would simply have your webapp push something to a queue that a continuously running console app would consume at the next suitable opportunity
beanstalkd - I only found this one while writing this answer, but it looks interesting
dropr is a PHP based message queue project, but hasn't been actively maintained since Sep 2010
php-enqueue is a recently (2017) maintained wrapper around a variety of queue systems
Finally, a blog post about using memcached for message queuing
Another, perhaps simpler, approach is to use ignore_user_abort() - once you've sent the page to the user, you can do your final processing without fear of premature termination, though this does have the effect of appearing to prolong the page load from the user's perspective.
When you just want to execute one or several HTTP requests without having to wait for the response, there is a simple PHP solution, as well.
In the calling script:
// $host and $remote_house are assumed to be defined by the caller
$socketcon = fsockopen($host, 80, $errno, $errstr, 10);
if ($socketcon) {
    $socketdata = "GET $remote_house/script.php?parameters=... HTTP/1.1\r\nHost: $host\r\nConnection: Close\r\n\r\n";
    fwrite($socketcon, $socketdata);
    fclose($socketcon); // close right away; we never read the response
}
// repeat this with different parameters as often as you like
In the called script.php, you can invoke these PHP functions in the first lines:
ignore_user_abort(true);
set_time_limit(0);
This causes the script to continue running without a time limit even after the HTTP connection is closed.
Another way to fork processes is via curl. You can set up your internal tasks as a webservice. For example:
http://domain/tasks/t1
http://domain/tasks/t2
Then in your user accessed scripts make calls to the service:
$service->addTask('t1', $data); // post data to URL via curl
Your service can keep track of the queue of tasks with MySQL or whatever you like. The point is that it's all wrapped up within the service, and your script just consumes URLs. This frees you up to move the service to another machine/server if necessary (i.e. it is easily scalable).
Adding HTTP authorization or a custom authorization scheme (like Amazon's web services) lets you open up your tasks to be consumed by other people/services (if you want), and you could take it further and add a monitoring service on top to keep track of queue and task status:
http://domain/queue?task=t1
http://domain/queue?task=t2
http://domain/queue/t1/100931
It does take a bit of set-up work but there are a lot of benefits.
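The $service object above is hypothetical; its addTask() method might boil down to a fire-and-forget curl POST like this sketch (URL and field names are made up):
<?php
// Hypothetical helper: POST task data to an internal task URL without
// waiting for the worker to finish (very short timeout, response ignored).
function addTask($task, array $data) {
    $ch = curl_init("http://domain/tasks/$task"); // internal service URL as above
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query($data));
    curl_setopt($ch, CURLOPT_TIMEOUT_MS, 100);    // don't wait for completion
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);
    curl_close($ch);
}

addTask('t1', ['image_id' => 123]);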
If it is just a question of offloading expensive tasks, and php-fpm is supported, why not use the fastcgi_finish_request() function?
This function flushes all response data to the client and finishes the request. This allows time-consuming tasks to be performed without leaving the connection to the client open.
You don't get true asynchronicity this way; rather:
Run all your main code first.
Execute fastcgi_finish_request().
Do all the heavy stuff.
Once again: php-fpm is needed.
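Under php-fpm the pattern looks roughly like this (the slow work at the end is a placeholder):
<?php
// 1. Produce the normal response for the user.
echo 'All done, thanks!';

// 2. Flush the response and close the connection (php-fpm only).
fastcgi_finish_request();

// 3. The browser already has its page; keep working server-side.
sleep(60); // placeholder for the expensive task
error_log('background work finished');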
I've used Beanstalkd for one project, and planned to again. I've found it to be an excellent way to run asynchronous processes.
A couple of things I've done with it are:
Image resizing - and with a lightly loaded queue passing off to a CLI-based PHP script, resizing large (2mb+) images worked just fine, but trying to resize the same images within a mod_php instance was regularly running into memory-space issues (I limited the PHP process to 32MB, and the resizing took more than that)
near-future checks - beanstalkd has delays available to it (make this job available to run only after X seconds) - so I can fire off 5 or 10 checks for an event, a little later in time
I wrote a Zend-Framework based system to decode a 'nice' url, so for example, to resize an image it would call QueueTask('/image/resize/filename/example.jpg'). The URL was first decoded to an array(module,controller,action,parameters), and then converted to JSON for injection to the queue itself.
A long running cli script then picked up the job from the queue, ran it (via Zend_Router_Simple), and if required, put information into memcached for the website PHP to pick up as required when it was done.
One wrinkle I also put in was that the cli script only ran for 50 loops before restarting; if it wanted to restart as planned, it would do so immediately (being run via a bash script). If there was a problem and it exited with exit(0) (the default value for exit; or die();), it would first pause for a couple of seconds.
Here is a simple class I coded for my web application. It allows forking PHP scripts and other scripts. It works on UNIX and Windows.
class BackgroundProcess {
    static function open($exec, $cwd = null) {
        if (!is_string($cwd)) {
            $cwd = @getcwd();
        }
        @chdir($cwd);
        if (strtoupper(substr(PHP_OS, 0, 3)) == 'WIN') {
            // Windows: run via the WScript.Shell COM object, hidden window,
            // without waiting for it to finish (third argument false).
            $WshShell = new COM("WScript.Shell");
            $WshShell->CurrentDirectory = str_replace('/', '\\', $cwd);
            $WshShell->Run($exec, 0, false);
        } else {
            // UNIX: discard output and background with &.
            exec($exec . " > /dev/null 2>&1 &");
        }
    }

    static function fork($phpScript, $phpExec = null) {
        $cwd = dirname($phpScript);
        @putenv("PHP_FORCECLI=true");
        if (!is_string($phpExec) || !file_exists($phpExec)) {
            if (strtoupper(substr(PHP_OS, 0, 3)) == 'WIN') {
                // Guess the php.exe location from the extension directory.
                $phpExec = str_replace('/', '\\', dirname(ini_get('extension_dir'))) . '\php.exe';
                if (@file_exists($phpExec)) {
                    BackgroundProcess::open(escapeshellarg($phpExec) . " " . escapeshellarg($phpScript), $cwd);
                }
            } else {
                // Locate the CLI binary on UNIX.
                $phpExec = exec("which php-cli");
                if ($phpExec == '' || $phpExec[0] != '/') {
                    $phpExec = exec("which php");
                }
                if ($phpExec != '' && $phpExec[0] == '/') {
                    BackgroundProcess::open(escapeshellarg($phpExec) . " " . escapeshellarg($phpScript), $cwd);
                }
            }
        } else {
            if (strtoupper(substr(PHP_OS, 0, 3)) == 'WIN') {
                $phpExec = str_replace('/', '\\', $phpExec);
            }
            BackgroundProcess::open(escapeshellarg($phpExec) . " " . escapeshellarg($phpScript), $cwd);
        }
    }
}
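Usage is then a one-liner; for example, to fire off the OP's sync script:
<?php
// Fork the price-table sync into the background (the path is a placeholder).
BackgroundProcess::fork('/path/to/PriceTableUpdate.php');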
PHP DOES have multithreading - it's just not enabled by default. There is an extension called pthreads which does exactly that.
You'll need PHP compiled with ZTS (Zend Thread Safety), though.
Links:
Examples
Another tutorial
pthreads PECL Extension
UPDATE: since PHP 7.2 the parallel extension comes into play
Tutorial/Example
reference manual
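A minimal pthreads sketch, assuming the pthreads PECL extension is loaded on a ZTS build of PHP 7.x:
<?php
// The run() method executes in a new thread once start() is called.
class SyncTask extends Thread
{
    public function run()
    {
        // long-running work goes here
        echo "running in a separate thread\n";
    }
}

$task = new SyncTask();
$task->start(); // spawn the thread
$task->join();  // optionally wait for it to finish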
This is the same method I have been using for a couple of years now and I haven't seen or found anything better. As people have said, PHP is single threaded, so there isn't much else you can do.
I have actually added one extra level to this and that's getting and storing the process id. This allows me to redirect to another page and have the user sit on that page, using AJAX to check if the process is complete (process id no longer exists). This is useful for cases where the length of the script would cause the browser to timeout, but the user needs to wait for that script to complete before the next step. (In my case it was processing large ZIP files with CSV like files that add up to 30 000 records to the database after which the user needs to confirm some information.)
I have also used a similar process for report generation. I'm not sure I'd use "background processing" for something such as an email, unless there is a real problem with a slow SMTP server. Instead I might use a table as a queue and then have a process that runs every minute to send the emails in the queue. You would need to be wary of sending emails twice or other similar problems. I would consider a similar queueing process for other tasks as well.
It's a great idea to use cURL as suggested by rojoca.
Here is an example. You can monitor text.txt while the script runs in the background:
<?php
function doCurl($begin)
{
    echo "Do curl<br />\n";
    $url = 'http://' . $_SERVER['SERVER_NAME'] . $_SERVER['REQUEST_URI'];
    $url = preg_replace('/\?.*/', '', $url); // strip any existing query string
    $url .= '?begin=' . $begin;
    echo 'URL: ' . $url . '<br>';
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $result = curl_exec($ch);
    echo 'Result: ' . $result . '<br>';
    curl_close($ch);
}

if (empty($_GET['begin'])) {
    doCurl(1);
} else {
    // Close the connection to the browser, then keep working.
    while (ob_get_level())
        ob_end_clean();
    header('Connection: close');
    ignore_user_abort(true); // without the argument, the setting is only read, not changed
    ob_start();
    echo 'Connection Closed';
    $size = ob_get_length();
    header("Content-Length: $size");
    ob_end_flush();
    flush();

    $begin = $_GET['begin'];
    $fp = fopen("text.txt", "w");
    fprintf($fp, "begin: %d\n", $begin);
    for ($i = 0; $i < 15; $i++) {
        sleep(1);
        fprintf($fp, "i: %d\n", $i);
    }
    fclose($fp);
    if ($begin < 10)
        doCurl($begin + 1);
}
?>
There is a PHP extension called Swoole.
Although it might not be enabled by default, it was available on my hosting and could be enabled at the click of a button.
Worth checking out. I haven't had time to use it yet, as I was searching here for info when I stumbled across it and thought it worth sharing.
Unfortunately PHP does not have any kind of native threading capabilities. So I think in this case you have no choice but to use some kind of custom code to do what you want to do.
If you search around the net for PHP threading stuff, some people have come up with ways to simulate threads on PHP.
If you set the Content-Length HTTP header in your "Thank You For Registering" response, then the browser should close the connection after the specified number of bytes are received. This leaves the server side process running (assuming that ignore_user_abort is set) so it can finish working without making the end user wait.
Of course you will need to calculate the size of your response content before rendering the headers, but that's pretty easy for short responses (write output to a string, call strlen(), call header(), render string).
This approach has the advantage of not forcing you to manage a "front end" queue, and although you may need to do some work on the back end to prevent racing HTTP child processes from stepping on each other, that's something you needed to do already, anyway.
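A minimal sketch of that sequence (the slow work at the end is a placeholder):
<?php
ignore_user_abort(true); // keep running after the browser disconnects

$body = 'Thank You For Registering';
header('Connection: close');
header('Content-Length: ' . strlen($body)); // browser closes after this many bytes
echo $body;

// push the buffered response out to the client
while (ob_get_level()) {
    ob_end_flush();
}
flush();

// the user already has the page; do the slow work now
sleep(30); // placeholder for sending the welcome email, etc.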
If you don't want the full-blown ActiveMQ, I recommend considering RabbitMQ. RabbitMQ is lightweight messaging that uses the AMQP standard.
I also recommend looking into php-amqplib - a popular AMQP client library for accessing AMQP-based message brokers.
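A minimal publisher sketch with php-amqplib, assuming RabbitMQ on localhost with the default guest credentials (queue name and payload are invented):
<?php
require 'vendor/autoload.php';

use PhpAmqpLib\Connection\AMQPStreamConnection;
use PhpAmqpLib\Message\AMQPMessage;

// Connect locally and declare a durable queue.
$connection = new AMQPStreamConnection('localhost', 5672, 'guest', 'guest');
$channel = $connection->channel();
$channel->queue_declare('tasks', false, true, false, false);

// Publish the job and return immediately; a separate consumer script
// (not shown) does the slow work.
$msg = new AMQPMessage(json_encode(['type' => 'welcome_email', 'user' => 42]));
$channel->basic_publish($msg, '', 'tasks');

$channel->close();
$connection->close();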
Spawning new processes on the server using exec(), or hitting another server directly using curl, doesn't scale well at all. With exec you are basically filling your web server with long-running processes which could be handled by other, non-web-facing servers, and using curl ties up another server unless you build in some sort of load balancing.
I have used Gearman in a few situations and I find it better for this sort of use case. I can use a single job-queue server to handle the queuing of all the jobs needing to be done, and spin up worker servers, each of which can run as many instances of the worker process as needed, then scale the number of worker servers up as needed and spin them down when not needed. It also lets me shut down the worker processes entirely when needed, and queues the jobs up until the workers come back online.
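With the pecl/gearman extension, queuing a background job takes only a couple of lines (assuming a gearmand server on localhost; 'sync' is a made-up function name that a worker registers):
<?php
// Client side: queue a background job and return immediately.
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);   // default gearmand port
$client->doBackground('sync', json_encode(['user_id' => 42]));

// Worker side (a separate long-running process):
// $worker = new GearmanWorker();
// $worker->addServer('127.0.0.1', 4730);
// $worker->addFunction('sync', function (GearmanJob $job) {
//     $data = json_decode($job->workload(), true);
//     // ... do the slow work ...
// });
// while ($worker->work());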
I think you should try this technique: it will help you call as many pages as you like - all pages will run at once, independently, without waiting for each page's response (asynchronously).
cornjobpage.php //mainpage
<?php
post_async("http://localhost/projectname/testpage.php", "Keywordname=testValue");
//post_async("http://localhost/projectname/testpage.php", "Keywordname=testValue2");
//post_async("http://localhost/projectname/otherpage.php", "Keywordname=anyValue");
//call as many as pages you like all pages will run at once independently without waiting for each page response as asynchronous.
?>
<?php
/*
 * Executes a PHP page asynchronously so the current page does not have to
 * wait for it to finish running.
 */
function post_async($url, $params)
{
    $post_string = $params;
    $parts = parse_url($url);
    $fp = fsockopen($parts['host'],
        isset($parts['port']) ? $parts['port'] : 80,
        $errno, $errstr, 30);
    if (!$fp) {
        return; // connection failed; nothing to wait for
    }
    $out = "GET " . $parts['path'] . "?$post_string" . " HTTP/1.1\r\n"; // you can use POST instead of GET if you like
    $out .= "Host: " . $parts['host'] . "\r\n";
    $out .= "Content-Type: application/x-www-form-urlencoded\r\n";
    $out .= "Content-Length: " . strlen($post_string) . "\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    fclose($fp); // close immediately; the called page keeps running
}
?>
testpage.php
<?php
echo $_REQUEST["Keywordname"]; // case1 output: testValue
?>
PS: if you want to send URL parameters in a loop, then follow this answer: https://stackoverflow.com/a/41225209/6295712
PHP is a single-threaded language, so there is no official way to start an asynchronous process with it other than using exec or popen. There is a blog post about that here. Your idea of a queue in MySQL is a good one as well.
Your specific requirement here is for sending an email to the user. I'm curious as to why you are trying to do that asynchronously since sending an email is a pretty trivial and quick task to perform. I suppose if you are sending tons of email and your ISP is blocking you on suspicion of spamming, that might be one reason to queue, but other than that I can't think of any reason to do it this way.

Running a url script inside php (that contains ajax)

I have a site that sends urls via email to my client, once they receive them they click the link, it loads in a browser and completes some Ajax that calls a PHP script. There are several AJAX functions being called in this script.
They have requested that this process be automated so they don't have to click the link and wait approximately 15 minutes each time for all the Ajax to complete.
Ideally, without recoding the functionality and continuing to use the exact same Ajax I would love to automate this process. So I would like to run a cron that loads a script that calls these URLS instead.
Is this possible?
I have tried several things, but nothing happens when I load the script. By nothing I mean neither errors nor the functionality of the script. (I have error reporting turned on.)
E.g.
cUrl
ini_set('display_errors', 1);
error_reporting(E_ALL);
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://example.com/myscript.php?varialbe=sample_get_content");
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
exec
exec('http://example.com/myscript.php');
simply opening the script...
$contents = file_get_contents('http://example.com/myscript.php?varialbe=sample_get_content');
I know that another option is to rebuild the functionality so that Im not using AJAX, but I would prefer not do that as it will take time.
EDIT: The actual script URL itself being called changes due to change in GET variables, so I cannot run it directly via cron (or can I?)
Suggested approach:
In the script that sends the link, instead of sending a link with unique GET data, have it do this:
exec("./myscript.php $param_1 $param_2");
In myscript.php replace:
$param_1 = $_GET['param_1'];
$param_2 = $_GET['param_2'];
With
$param_1 = $argv[1];
$param_2 = $argv[2];
http://php.net/manual/en/reserved.variables.argv.php
Also add
#!/path/to/phpcgibin -q
to myscript.php before the <? and make sure to upload it as ASCII.

Running an external php code asynchronously

I am building a web service using PHP:
Basically,
The user sends a request to the server via HTTP (e.g. 'request.php').
The server starts some PHP code asynchronously (e.g. 'update.php').
The connection with the user is finished.
The code in 'update.php' is still running, and will finish after some time.
The code in 'update.php' is finished.
The problem is getting PHP to run that external code asynchronously.
Is that possible? Is there another way to do it? With shell_exec?
Please, I need insights! An elegant way is preferable.
Thank you!
The best approach is using a message queue like RabbitMQ, or even a simple MySQL table.
Each time you add a new task in the front controller, it goes into the queue. Then update.php, run by a cron job, fetches it from the queue, processes it, saves the results, and marks the task as finished.
This will also help you distribute load over time, preventing a DoS caused by your own script.
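A sketch of the simple-MySQL-table variant (table and column names are invented):
<?php
// Front controller: enqueue the task and return to the user immediately.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->prepare('INSERT INTO job_queue (payload, status) VALUES (?, ?)')
    ->execute([json_encode(['action' => 'update']), 'pending']);

// update.php, run by cron: fetch pending jobs, process, mark finished.
$jobs = $pdo->query("SELECT id, payload FROM job_queue WHERE status = 'pending'");
foreach ($jobs as $job) {
    // ... process json_decode($job['payload'], true) here ...
    $pdo->prepare("UPDATE job_queue SET status = 'done' WHERE id = ?")
        ->execute([$job['id']]);
}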
You could have the user connect to update.php, generate some sort of unique ID to keep track of the process, and then call fsockopen() on itself with a special GET variable to signify that it's doing the heavy lifting rather than user interaction. Close that connection immediately, and then print out the appropriate response to the user.
Meanwhile, look for the special GET variable you specified, and when present call ignore_user_abort() and proceed with whatever operations you need in that branch of the if clause. So here's a rough skeleton of what your update.php file would look like:
<?php
if (isset($_GET['asynch'])) {
    ignore_user_abort(true); // pass true, or the setting is only read, not changed
    // check for $_GET['id'] and validate,
    // then execute long-running code here
} else {
    // generate $id here
    $host = $_SERVER['SERVER_NAME'];
    $url = "/update.php?asynch&id={$id}";
    if ($handle = fsockopen($host, 80, $n, $s, 5)) {
        $data = "GET {$url} HTTP/1.0\r\nHost: {$host}\r\n\r\n";
        fwrite($handle, $data);
        fclose($handle);
    }
    // return a response to the user
    echo 'Response goes here';
}
?>
You could build a service with PHP.
Or launch a PHP script using bash : system("php myScript.php param param2 &")
Look into worker processes with Redis (e.g. php-resque) or Gearman
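For instance, a sketch following the php-resque README (assumes Redis on localhost; the job class name and payload are invented):
<?php
require 'vendor/autoload.php';

// Point php-resque at the local Redis instance.
Resque::setBackend('localhost:6379');

// Queue a job; a separate worker process (started via bin/resque) picks it up.
Resque::enqueue('default', 'Sync_Job', ['user_id' => 42]);

// The job class the worker will run:
class Sync_Job
{
    public function perform()
    {
        // $this->args holds the payload passed to enqueue()
        // ... do the slow work here ...
    }
}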

Request a PHP script from another script and move on

I'm building a forum that will allow users to upload images. Images will be stored on my web server temporarily (so I can upload them with a progress bar) before being moved to an S3 bucket. I haven't figured out a brilliant way of doing this, but I think the following makes sense:
Upload image(s) to the web server using XHR with progress bar
Wait for user to submit his post
Unlink images he did not end up including in his post
Call a URL that uploads the remaining images to S3 (and update image URLs in post body when done)
Redirect user to his post in the topic
Now, since step 4 can take a considerable amount of time, I'm looking for a cron-like solution, where I can call the S3 upload script in the background and not have the user wait for it to complete.
Ideally, I'm looking for a solution that allows me to request a URL within my framework and pass some image id's in the query, i.e.:
http://mysite.com/utils/move-to-s3/?images=1,2,3
Can I use cURL for this purpose? Or if it has to be exec(), can I still have it execute a URL (wget?) instead of a PHP script (php-cli)?
Thanks a heap!
PHP's
register_shutdown_function()
is your friend [reference].
The shutdown function keeps running after your script has terminated.
Thus, if everything is available, send the final page and exit. The registered shutdown function then continues and performs the time-consuming job.
In my case, I prepared a class CShutdownManager, which allows registering several methods to be called after script termination. For example, I use CShutdownManager to delete temporary files that are no longer needed.
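A minimal sketch of the idea, without the CShutdownManager wrapper:
<?php
ignore_user_abort(true); // don't stop when the client disconnects

register_shutdown_function(function () {
    // runs after the script has produced its normal output
    // (how soon the client connection closes depends on the SAPI)
    sleep(10); // placeholder for the time-consuming job
    error_log('deferred clean-up done');
});

echo 'Final page for the user';
// script ends here; the shutdown function runs afterwards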
Try the following statement:
shell_exec('php scriptname.php');
I found the solution to this problem which I'm happy to share. I actually found it on SO, but it needed some tweaking. Here goes:
The solution requires either exec() or shell_exec(). It doesn't really matter which one you use, since all output will be discarded anyway. I chose exec().
Since I am using MAMP, rather than a system-level PHP install, I needed to point to the PHP binary inside the MAMP package. (This actually made a difference.) I decided to define this path in a constant named PHP_BIN, so I can set a different path for local and live environments. In this case:
define('PHP_BIN', '/Applications/MAMP/bin/php/php5.3.6/bin/php');
Ideally, I wanted to execute a script inside my web framework instead of some isolated shell script. I wrote a wrapper script that accepts a URL as an argument and named it my_curl.php:
<?php
if (isset($argv[1]))
{
    $url = $argv[1];
    // only fetch proper http(s) URLs
    if (preg_match('/^http(s)?:\/\//', $url))
    {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_exec($ch);
        curl_close($ch);
    }
}
In this SO question I found the way to execute a shell command in the background. This is necessary because I don't want the user to have to wait until it's finished.
Finally, I run this bit of PHP code to execute the other (or ANY) web request in the background:
exec(PHP_BIN . ' /path/to/my_curl.php http://site.com/url/to/request &> /dev/null &');
Works like a charm! Hope this helps.

How to execute long running tasks in PHP without cron

Let's imagine my situation (it's fake, of course)...
I have a web site with 1000 active users. I need to do some stuff that is slow (database clean-up, backups, etc.). For example, it may take 5 minutes.
I want to accomplish this: when a user opens my web site and certain conditions are met (for example, he is the first visitor in a week), he somehow sends a signal to the server that the slow process must run now, but he doesn't wait for its execution (those several minutes)... he just sees a notification that the server is making some updates or whatever. Any other user that opens my web site at that time also sees that notification.
When the process has finished, the web site returns to its normal behavior. And it all happens automatically.
Is this possible without cron ?
Check Execution Operators. They may be what you are looking for.
Something like this:
$execute = `/path/to/php/script.php`;
// If you needed the output
echo $execute;
Once this script runs, set a flag (it could be in the database, or a simple write to a file, or something similar). When the script ends, delete this flag.
In all your other pages, check this flag. If it is ON, then display the message.
On your page you could have a database value, e.g. script_running: check the value; if it = 0, send the command; if it = 1, do nothing. Then make sure you set the value to 1 at the beginning of your command and 0 at the end.
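A file-based sketch of that flag (the path is arbitrary; requires PHP 5.5+ for finally):
<?php
$flag = sys_get_temp_dir() . '/maintenance.flag';

if (!file_exists($flag)) {
    touch($flag);          // set the flag: the slow script is running
    try {
        sleep(300);        // placeholder for the clean-up/backup work
    } finally {
        unlink($flag);     // clear the flag even if the work throws
    }
}

// On every other page:
if (file_exists($flag)) {
    echo 'Server is making some updates, please check back shortly.';
}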
You can use cURL for this task. In a database or a text file, save the time the script was last run, and run the script one day a week (of course, update the date before executing the script):
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,"http://www.exaple.com/service_script.php");
curl_setopt($ch, CURLOPT_RETURNTRANSFER,1);
curl_setopt($ch, CURLOPT_TIMEOUT, 1);
$result_curl = curl_exec($ch);
curl_close($ch);
