My domain host only allows two cron jobs to be set.
I have one already set to download a file over FTP.
I need to run 10 more cron URLs from the remaining job, via a single PHP file.
Is this at all possible?
Format of the cron URL (10 parts):
https://www.example.com?route=extension/module/import&import_id=1&part=1_10
I'm not sure how to combine these into a single PHP file.
Would something like this work?
<?php
ini_set('display_errors', 1);
for ($part = 1; $part <= 10; $part++) {
    curl_request("https://www.example.com/?route=extension/module/import&import_id=1&part=" . $part . "_10");
}
function curl_request($url, $method = "GET", $postFields = "")
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    if ($method == "POST") {
        curl_setopt($ch, CURLOPT_POST, 1);
        curl_setopt($ch, CURLOPT_POSTFIELDS, $postFields);
    } else {
        curl_setopt($ch, CURLOPT_HTTPGET, true);
    }
    $response = curl_exec($ch);
    curl_close($ch); // free the cURL handle
    echo $response;
    return $response;
}
?>
Is it possible to add a delay between the requests?
As I understand it, you want to run scripts periodically, but your web host only allows two cron jobs.
One solution is to run a single PHP script as the cron job. Within this script you can check the current time and, if it matches, run the other scripts. For example, run the main PHP script every 10 minutes and have it decide which of the other PHP scripts are due.
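A hedged sketch of that dispatcher idea, assuming the single cron entry fires every 10 minutes; the worker script names here are invented for illustration:
<?php
// dispatcher.php - the one script the cron job runs, every 10 minutes
$minute = (int) date('i'); // current minute of the hour

if ($minute === 0) {
    require __DIR__ . '/hourly_task.php'; // top of the hour only
}
if ($minute % 30 === 0) {
    require __DIR__ . '/half_hourly_task.php'; // minutes 0 and 30
}
require __DIR__ . '/every_run_task.php'; // every invocation, i.e. every 10 minutes
?>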
Include all the cron files in a single file and add that newly created file to cron, as in:
newly_created_file.php
<?php
ob_start(); // buffer any output the included scripts produce
require_once("filepath/filename_1.php");
require_once("filepath/filename_2.php");
require_once("filepath/filename_3.php");
require_once("filepath/filename_4.php");
ob_end_clean(); // discard the buffered output
?>
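The single remaining cron entry then just runs that one file. Assuming your host exposes a standard crontab and a PHP CLI binary (the path and interval below are placeholders), the entry could look like:
*/10 * * * * php /path/to/newly_created_file.php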
Guys, I'm in serious trouble trying to solve this.
The scenario:
Here at work we use the Vulnerability Management tool QualysGuard.
Skipping all technical details, this tool basically detects vulnerabilities in all servers and for each vulnerability in each server it creates a Ticket Number.
From the UI I can access all these tickets and download a CSV file with all of them.
The other way of doing it is by using the API.
The API uses some cURL calls to access the database and retrieve the info that I specify in the parameters.
The method:
I'm using a script like this to get the data:
<?php
$username = "myUserName";
$password = "myPassword";
$proxy = "myProxy";
$proxyauth = 'myProxyUser:myProxyPassword';
$url = "https://qualysapi.qualys.com/msp/ticket_list.php?"; // the official script, provided by Qualys, for this task
$postdata = "show_vuln_details=0&SINCE_TICKET_NUMBER=1&CURRENT_STATE=Open&ASSET_GROUPS=All";

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_PROXY, $proxy);
curl_setopt($ch, CURLOPT_PROXYUSERPWD, $proxyauth);
curl_setopt($ch, CURLOPT_TIMEOUT, 60);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_REFERER, $url);
curl_setopt($ch, CURLOPT_USERPWD, $username . ":" . $password);
curl_setopt($ch, CURLOPT_POSTFIELDS, $postdata);
curl_setopt($ch, CURLOPT_POST, 1);
$result = curl_exec($ch);
curl_close($ch); // free the cURL handle

$xml = simplexml_load_string($result);
?>
The script above works fine. It connects to the API, passes some parameters to it, and ticket_list.php generates an XML file with everything I need.
The Problems:
1-) The script only returns a maximum of 1000 results in the XML file.
If my request generates more than 1000 results, the script creates a tag like this at the end of the XML:
<TRUNCATION last="5066">Truncated after 1000 records</TRUNCATION>
In this case, I would need to execute another cURL call with the parameters below:
$postdata = "show_vuln_details=0&SINCE_TICKET_NUMBER=5066&CURRENT_STATE=Open&ASSET_GROUPS=All";
2-) There are approximately 300,000 tickets in Qualys' database (cloud), and I need to download all of them and insert them into MY database, which is used by an application I'm creating. This application has some forms which are filled in by the user, and a bunch of queries are run against the database.
The question:
What would be the best way for me to do the task above?
I've got some ideas, but I'm at a complete loss.
I thought:
1-) Create a function that does the call above, parses the XML and, if the TRUNCATION tag exists, gets its value and calls itself again, recursively, until a result without the TRUNCATION tag comes back.
The problem with this one is that I wasn't able to merge the XML results of each call, and I'm not sure it would avoid memory issues, since nearly 300 cURL calls would be needed. This script would be executed automatically via the server's crontab during a non-business period.
2-) Instead of retrieving all the data, I make the forms I mentioned post their data to the script and make the cURL calls with the parameters the user POSTed. But again I'm not sure that would be good, since I would still need to make multiple calls, depending on the parameters the user sends.
3-) This is a crazy one: use some sort of macro software to record me while I log in to the UI, go to the page where the tickets are located, click the download button, check the CSV option and click to download again. Then export this recording to some language like Python or Java, create a crontab task, and write a script that parses the downloaded CSV and inserts the data into the database. (Crazy or not? =P)
Any help is very welcome; maybe the answer is right before my eyes and I just haven't seen it yet.
Thanks in advance!
I believe the proper way would involve a queue worker. However, if I were you, I'd make the script grab five of these XML files in one execution: grab one, insert the rows, free the memory, repeat. Then I'd run it a few times manually to see what execution time and memory it requires. Once you have a good idea of the execution time, and you can see memory will not be a problem, schedule a cron for a little under double that time. If all goes well it should be about a minute between runs, and you can have everything in your DB within an hour.
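A minimal sketch of that batching loop, assuming the question's request code is wrapped in a hypothetical fetch_tickets($since) helper (one cURL POST with SINCE_TICKET_NUMBER=$since, returning the parsed SimpleXMLElement) and a hypothetical insert_tickets($xml) that stores one batch in your database; depending on where the TRUNCATION element sits in the response, you may need SimpleXML's xpath() instead of direct child access:
<?php
$since = 1; // first request matches SINCE_TICKET_NUMBER=1 from the question
do {
    $xml = fetch_tickets($since);              // one API call, at most 1000 tickets
    insert_tickets($xml);                      // store this batch right away
    $last = (string) $xml->TRUNCATION['last']; // empty string when nothing was truncated
    $since = ($last !== '') ? (int) $last : 0; // continue from the last ticket number
    unset($xml);                               // free this batch before the next call
} while ($since > 0);
?>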
I am new to PHP. I have a block of code on my webpage which I want to execute asynchronously. This block does the following:
1. A shell_exec command.
2. An ftp_get_content call.
3. Two image resizes.
4. One MySQL insert.
Is there a way to make this block async, so that the rest of the page loads quickly?
Please ask if any more details are required.
One possible solution is to use cURL to make a pseudo-async call. You can put the async part of your code in a separate PHP file and call it via cURL. For example:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'YOUR_URL_WITH_ASYNC_CODE');
curl_setopt($ch, CURLOPT_FRESH_CONNECT, true);
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 1); // give up after 1 ms so this page doesn't wait
curl_exec($ch);
curl_close($ch);
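For this trick to work, the script behind YOUR_URL_WITH_ASYNC_CODE must keep running after cURL hangs up at the 1 ms mark; a minimal sketch of how that script could start (note that sub-second cURL timeouts can also require CURLOPT_NOSIGNAL on the calling side with some cURL builds):
<?php
// the async part, served at YOUR_URL_WITH_ASYNC_CODE
ignore_user_abort(true); // keep running after the caller disconnects
set_time_limit(0);       // give the slow work time to finish

// ... shell_exec, the FTP fetch, the image resizes and the MySQL insert go here ...
?>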
You could put the 4 tasks into a queue, maybe something like Beanstalkd, then have a background worker process this queue.
E.g.:
<?php
// get the parameters here
// send the response / end the connection
// keep executing the script with the retrieved parameters
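A concrete, hedged version of that sketch, using output buffering to finish the HTTP response early and then carry on working (fastcgi_finish_request() only exists under PHP-FPM, hence the guard):
<?php
$params = $_GET; // get the parameters

ignore_user_abort(true); // don't stop when the client disconnects
ob_start();
echo "OK"; // whatever response the client should see
header('Connection: close');
header('Content-Length: ' . ob_get_length());
ob_end_flush();
flush(); // the response is now fully sent
if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request(); // PHP-FPM: actually close the connection
}

// ... keep executing with $params; the client is no longer waiting ...
?>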
You could do this; it just might take some tinkering. Instead of trying to close the connection in the first script, you need to process the data with a different script.
<?php
//Get Parameters
//Send output to user
//now use curl to access other script
$post = http_build_query($_POST); // you can replace this with a processed array
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://www.example.com/otherscript.php");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 0);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $post);
curl_exec($ch);
curl_close($ch);
?>
otherscript.php
<?php
header( "Connection: Close" );
// do your processing
?>
Just to explain: when cURL connects, it gets a Connection: close header, so cURL quits. Meanwhile, "otherscript" is processing the data with no open connections.
I'm pretty sure using exec() is also an option. You could simply call otherscript using PHP on the command line, passing the variables as command-line arguments. Something like this should work for you if you are running Linux:
exec("nohup php -f otherscript.php -- '$arg1' '$arg2' < /dev/null &");
Now otherscript.php is running in the background under a different process ID.
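On the receiving side, the command-line arguments land in $argv (everything after the -- separator); a minimal sketch of otherscript.php:
<?php
// otherscript.php, invoked as: php -f otherscript.php -- 'arg1' 'arg2'
$arg1 = isset($argv[1]) ? $argv[1] : null; // first argument after --
$arg2 = isset($argv[2]) ? $argv[2] : null; // second argument
// ... do the background processing with $arg1 and $arg2 ...
?>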
I want to run my PHP script every 5 minutes. Here is my PHP code.
function call_remote_file($url){
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
}
set_time_limit(0);
$root='http://mywebsiteurl'; //remote location of the invoking and the working script
$url=$root."invoker.php";
$workurl=$root."script.php";
call_remote_file($workurl);//call working script
sleep(60*5);// wait for 300 seconds.
call_remote_file($url); //call again this script
I ran this code once and it works perfectly, even after I close the entire browser window.
The problem is that it stops working if I turn off my system's internet connection.
How do I solve this problem? Please help me out.
While I wouldn't really recommend doing this for something critical (you're going to have stability issues), this could work:
function call_remote_file($url){
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
}
set_time_limit(0);
$root='http://mywebsiteurl'; //remote location of the invoking and the working script
$url=$root."invoker.php";
$workurl=$root."script.php";
while(true)
{
call_remote_file($workurl);//call working script
sleep(60*5);// wait for 300 seconds.
}
Another way would be to call it from the command line using exec():
function call_remote_file($url){
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
}
set_time_limit(0);
$root='http://mywebsiteurl'; //remote location of the invoking and the working script
$url=$root."invoker.php";
$workurl=$root."script.php";
call_remote_file($workurl);//call working script
sleep(60*5);// wait for 300 seconds.
exec('php ' . $_SERVER['SCRIPT_FILENAME']);
You should really use cron though if at all possible.
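Assuming cron and the PHP CLI are available (the path below is a placeholder), a 5-minute entry would look like:
*/5 * * * * php /path/to/script.php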
The looping code above is OK, but if you want to add multiple scripts that run at different intervals, the coding becomes far more complicated.
If you try phpjobscheduler (open source, so free to use), it provides an interface to add, modify and remove scripts to run.
Is there any way to get the PID of the process spawned by a cURL call? Here's a quick cURL example in foo.php:
<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_URL, "http://www.foobar.com/bar.php");
$contents = curl_exec($ch);
curl_close($ch);
?>
And I'd like the PID of the bar.php process for use in foo.php. My instincts say there's no way, but I figured I'd see if anyone has tried something like this.
If it helps, foo.php and bar.php exist on the same server.
A call to cURL does not spawn a new process; it uses libcurl to make the request from within PHP. For functions related to getting PIDs and such, see the manual section on POSIX functions. In particular, you may be interested in posix_getpid or getmypid. Have bar.php find its own PID and pass it back to foo.php.
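A minimal sketch of that hand-off, assuming bar.php is free to make the PID its entire response body (any format works as long as foo.php knows how to parse it):
<?php
// bar.php: report the PID of the PHP process serving this request
echo getmypid();
// ... the rest of bar.php's work ...
?>
On the foo.php side, $contents from the question then holds that PID, e.g. $barPid = (int) $contents;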