I have 7-8 PHP scripts that pull data from a remote server and store it on our server. Each script inserts/updates around 3000-4000 records at a time. When I hit any single script from the browser it works fine, but if I try to chain them together by calling header('Location: http://www.example.com/') it breaks. Can anyone suggest a better way to handle this? Someone suggested multi-threading; I have not used threading yet, so can anyone help me with a better approach/solution? TIA.
Note: your current code doesn't work because header('Location: example.com') redirects the browser to example.com, which means your PHP script has finished running and the browser is now on example.com.
Solution 1:
If allow_url_fopen is set to "On" in php.ini, you can execute them using:
<?php
$url1 = file_get_contents('http://www.example.com/1.php');
$url2 = file_get_contents('http://www.example.com/2.php');
?>
and so on...
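Note that file_get_contents() returns false on failure, so it is worth checking each call before moving on; a minimal sketch, reusing the example URLs above:
<?php
// Trigger each import script in turn, stopping at the first failure.
$urls = array('http://www.example.com/1.php', 'http://www.example.com/2.php');
foreach ($urls as $url) {
    $result = file_get_contents($url);
    if ($result === false) {
        die("Request to $url failed");
    }
}
?>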
Solution 2:
function initCURL($url) {
    $curl = curl_init();
    curl_setopt($curl, CURLOPT_URL, $url);
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true); // return the response instead of printing it
    curl_setopt($curl, CURLOPT_HEADER, false);        // omit response headers from the result
    $data = curl_exec($curl);
    curl_close($curl);
    return $data;
}
use it as follows:
<?php
$url1 = initCURL('http://www.example.com/1.php');
$url2 = initCURL('http://www.example.com/2.php');
?>
In these examples, $url1 and $url2 will contain whatever data is returned by the scripts.
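Since the original question involves 7-8 such scripts, one way to run them back to back is a simple loop over initCURL(); a minimal sketch (the script names are hypothetical):
<?php
$scripts = array('1.php', '2.php', '3.php'); // hypothetical list of import scripts
foreach ($scripts as $script) {
    $data = initCURL('http://www.example.com/' . $script);
    if ($data === false) { // curl_exec() returns false on failure
        echo "Failed to run $script\n";
    }
}
?>
Bear in mind that thousands of inserts per script can exceed PHP's default 30-second max_execution_time, so you may need to raise it for the calling script.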
Is there any way you could request a webpage from a server and display it to a user? It would essentially act as a proxy. Here is how it would work:
Client sends the server running the script the website it wants > server fetches the website > server displays the website to the client.
Just to clarify: the client never contacts the website; the server running the PHP script does.
So, swiftly cutting to the chase: is it possible? And if so, how would you do it?
Try this:
function get_data($url) {
    $ch = curl_init();
    $timeout = 5;
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);        // return the body instead of printing it
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout); // give up connecting after 5 seconds
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}
Example usage:
echo get_data('https://www.somedomain.xyz');
You can also do this via:
echo file_get_contents("https://www.somedomain.xyz");
Use this:
$url = $_GET['url'];
if (filter_var($url, FILTER_VALIDATE_URL)) {
    $contents = file_get_contents($url);
}
That way you are at least a little protected. Remember that fetching user-supplied URLs is risky.
echo file_get_contents("https://www.google.com")
try it.
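For completeness, a minimal proxy sketch along the lines the question describes, assuming the validation above and that relaying the upstream Content-Type is enough for your case (a real proxy needs far more care with headers and security):
<?php
$url = $_GET['url'];
if (!filter_var($url, FILTER_VALIDATE_URL)) {
    http_response_code(400);
    exit('Invalid URL');
}
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$body = curl_exec($ch);
$type = curl_getinfo($ch, CURLINFO_CONTENT_TYPE); // e.g. "text/html; charset=UTF-8"
curl_close($ch);
if ($body !== false) {
    header('Content-Type: ' . $type); // relay the upstream content type
    echo $body;                       // serve the fetched page to the client
}
?>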
Currently I have a page, say page1.php. Now on a certain event, I just want to run another link, say http://example.com, without actually refreshing this page. The link is a kind of script which updates my database. I tried using shell_exec('php '.$url); where $url = 'http://example.com', but it gave me an error that it could not open the file, so I suppose shell_exec works only for files present locally on the server. Is there a way to do this directly, or do I have to go with AJAX? Thanks in advance.
Try using cURL to send the request from PHP.
$url = 'http://example.com';
$ch = curl_init();
curl_setopt($ch, CURLOPT_AUTOREFERER, TRUE);
curl_setopt($ch, CURLOPT_NOBODY, TRUE); // we only want to trigger the script, not fetch its body
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, TRUE);
curl_exec($ch);
curl_close($ch);
Alternatively you could try file_get_contents
file_get_contents('http://example.com');
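If the goal is just to trigger the script without waiting for a response at all, another common pattern is to open a raw socket, send the request, and close immediately; a minimal sketch, assuming plain HTTP on port 80 (the path is hypothetical):
<?php
$fp = fsockopen('example.com', 80, $errno, $errstr, 5);
if ($fp) {
    $out  = "GET /update-script.php HTTP/1.1\r\n"; // hypothetical script path
    $out .= "Host: example.com\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    fclose($fp); // return immediately instead of reading the response
}
?>
Pairing this with ignore_user_abort(true) in the receiving script helps ensure it finishes even though the caller disconnects right away.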
I would do this front-end, and I would use JSONP: much cleaner and safer IMHO.
I am performing a cURL request on an SSL page (page1.php) that in turn performs a cURL request on another SSL page (page2.php). Both pages are on my site within the same directory, and both return XML. Through logging I see that page2.php is being hit and outputs valid XML. I can also hit page2.php in a browser and it returns valid XML. However, page1.php is timing out and never returning the XML.
Here is the relevant code from Page1.php:
$url = "https://mysite.com/page2.php"
$c = curl_init($url);
if ($c)
{
curl_setopt($c,CURLOPT_RETURNTRANSFER, true);
curl_setopt($c,CURLOPT_FOLLOWLOCATION, true);
curl_setopt($c,CURLOPT_CAINFO, "cacert.pem");
curl_setopt($c, CURLOPT_SSL_VERIFYPEER, true);
curl_setopt($c, CURLOPT_SSL_VERIFYHOST, 2);
curl_setopt($c,CURLOPT_TIMEOUT, 30);
curl_setopt($c,CURLOPT_FRESH_CONNECT, true);
$result = curl_exec($c);
curl_close($c);
}
$result never has anything in it.
page2.php has similar options set, but its $result var does have the expected data in it.
I'm a bit of a noob when it comes to PHP so I'm hoping that I'm overlooking something really simple here.
BTW, we are using a WAMP setup with Windows Server 2008.
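One way to narrow this down is to log cURL's own error when curl_exec() fails; a minimal sketch to drop into page1.php:
<?php
$result = curl_exec($c);
if ($result === false) {
    // e.g. "SSL certificate problem" or "Operation timed out after 30000 milliseconds"
    error_log('cURL error ' . curl_errno($c) . ': ' . curl_error($c));
}
curl_close($c);
?>
Also, if both pages call session_start() with the same session, the inner request will block on PHP's session file lock until the outer script finishes, which produces exactly this kind of timeout; calling session_write_close() before the cURL request releases the lock.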
I have done quite a bit of searching and cannot quite find my answer. My problem is that I am trying to call a link with GET variables attached to it, and it just hangs until the connection times out. When I call the link directly in a web browser, it works fine, no problem.
Here is the fopen() php code example:
<?php
$url = "https://www.mysite.com/folder/second_folder/file.php?varA=val1&varB=val2&varC=val3&varD=val4&varE=val5";
$ch = fopen($url, 'r');
if (!$ch) {
    echo "could not open!!! $url";
} else {
    echo "Success! ($url)";
}
?>
I can call file.php without the GET variables just fine; it returns with no error.
NOTE: file.php, when one of the vars is passed, runs some functions and then does a header Location redirect. I do not think it even gets that far before the connection times out, though, because I put a "check point" before the header Location line which should email me, and it does not.
Again, if I run the URL in a web browser, it works just fine.
So what is going on, if anyone can help me? I just need to run the URL as if PHP were clicking the link. I have used fopen before, but for some reason it does not work now. cURL did not work on this either.
Try changing the single quotes to double quotes in this case. My working code is:
<?php $handle = fopen("c:\\folder\\resource.txt", "r"); ?>
I think you want to be using
$ch = file_get_contents($url);
Edit: cURL option
// open
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_MAXREDIRS, 1);
curl_setopt($ch, CURLOPT_FORBID_REUSE, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$page_data = curl_exec($ch);
$page_info = curl_getinfo($ch);
// close
curl_close($ch);
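Since the hang only appears once the GET variables are attached, it is also worth making sure they are properly encoded; a small sketch using http_build_query() (the parameter names are the question's own placeholders):
<?php
$params = array(
    'varA' => 'val1',
    'varB' => 'val2',
    'varC' => 'val3',
    'varD' => 'val4',
    'varE' => 'val5',
);
// http_build_query() URL-encodes each value automatically
$url = 'https://www.mysite.com/folder/second_folder/file.php?' . http_build_query($params);
?>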
In my application I have to send an SMS to the user during registration. While inserting the record in the database, I want to hit this URL in the background.
Can anyone suggest how to run this URL in the background?
http://www.myurl.com/smpp/sendsms?username=XX&password=XX&to=XX&from=XX&text=Test
Well, this depends on what you mean by background; I'm assuming, however, that you mean that the user won't be redirected to that page.
If I were you, I'd go with cURL if you have it installed, since the only thing you seem to want to do is make an ordinary request and maybe read the response. The code below is untested but should give you a hint.
$req = curl_init();
curl_setopt($req, CURLOPT_URL, "theaddress_and_params");
curl_exec($req);
curl_close($req);
public function get_url($url)
{
    // escapeshellarg() prevents shell injection via the URL
    $cmd = "curl --max-time 60 " . escapeshellarg($url);
    $cmd .= " > /dev/null 2>&1 &"; // discard output and run in the background
    exec($cmd, $output, $exit);
    return $exit == 0;
}
It will call curl via the CLI, and it will run in the background.
If you did that, you would be exposing usernames and passwords in the URL (or headers). Have the user log in in advance and use a session variable.
Don't send this from the client side, since every user would easily be able to "fake" the data by just loading your URL with some (potentially malicious) parameters. "username" and "password" are not protected at all, and I'm sure your service would be brought down very quickly.
Instead, you could easily do this in the background (server-side) with PHP's cURL functions:
http://www.php.net/manual/en/curl.examples-basic.php
$url = 'http://yoursmsgateway.com/WebSMS/SMSAPI.jsp?username=' . $smsuser . '&password=' . $smspwd . '&sendername=' . $smssender . '&mobileno=' . $number . '&message=' . urlencode($message);
echo $url;
$mystring = get_data($url);
//echo "hi!";
echo $mystring;

function get_data($url) {
    $ch = curl_init();
    $timeout = 5;
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    // Note: disabling SSL verification is insecure; only do this if you trust the gateway
    curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, false);
    curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
    curl_setopt($ch, CURLOPT_MAXREDIRS, 10);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, $timeout);
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}
You can try this code, but you have to enable PHP's cURL extension first. The process to enable cURL is given below:
1.) Open php.ini
2.) Find this line ---> ;extension=php_curl.dll
3.) Remove the ; (semicolon)
4.) So it reads ---> extension=php_curl.dll
5.) Save it (Ctrl+S) and restart your web server
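To double-check that the extension is actually loaded, a quick guard like this (just a convenience sketch) works:
<?php
// cURL's functions only exist once the extension is loaded.
if (!function_exists('curl_init')) {
    die('The cURL extension is not enabled');
}
?>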
You could fork a child process with the pcntl PHP extension.
(library that implements this: https://github.com/kriswallsmith/spork)
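A minimal sketch of the idea, assuming a CLI context (pcntl is not available in the usual web server SAPIs) and a hypothetical URL:
<?php
$pid = pcntl_fork();
if ($pid === -1) {
    die('could not fork');
} elseif ($pid === 0) {
    // child process: perform the slow HTTP call
    file_get_contents('http://www.example.com/slow-task.php'); // hypothetical URL
    exit(0);
}
// parent process: continues immediately without waiting for the child
?>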
Make an AJAX call as the user hits the submit button. This would let the script at that URL run in the background while your current PHP inserts the record into the database.
AJAX? Load that URL inside a div with AJAX while you save the record by calling another PHP file with AJAX.
I think that there is no such concept as multithreading (which in essence is what you are asking for), as everything in PHP code runs sequentially, but you can get to a solution. See this and this question and their answers.
The reason there is no multithreading in PHP is that everything is processed on the server, and you, as a client, receive an already finished response, so "running in the background" in PHP amounts to "running sequentially".
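That said, when the goal is simply to issue several HTTP requests at once (as in the original 7-8-scripts question), PHP's curl_multi_* functions let the requests themselves run concurrently without threads; a minimal sketch:
<?php
$urls = array('http://www.example.com/1.php', 'http://www.example.com/2.php');
$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}
// drive all transfers until none are still running
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh); // wait for activity instead of busy-looping
} while ($running > 0);
foreach ($handles as $ch) {
    $data = curl_multi_getcontent($ch); // response body for this handle
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
?>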