accessing $_SESSION when using file_get_contents in PHP [duplicate] - php

This question already has answers here:
How to send cookies with file_get_contents
(4 answers)
Closed 2 years ago.
I have a page called send.email.php which sends an email - pretty simple stuff - I pass an order id, it creates a job request and sends it out. This works fine in the context I developed it for (use JavaScript to make an AJAX call to the URL and pass the order_id as a query parameter).
I am now trying to reuse the exact same page in another application, however I am calling it using PHP's file_get_contents($base_url.'admin/send.email.php?order_id='.$order_id). When I call the page this way, the $_SESSION array is empty (empty($_SESSION) returns 1).
Is this because I am initiating a new session using file_get_contents and the values I stored in the $_SESSION on login are not available to me within there?
Edit: Thanks for the feedback. It makes sense that the new call doesn't have access to the existing session...
New problem though:
I now get "failed to open stream: HTTP request failed!" when trying to execute:
$opts = array('http' => array('header'=> 'Cookie: ' . $_SERVER['HTTP_COOKIE']."\r\n"));
$context = stream_context_create($opts);
$contents = file_get_contents($base_url.'admin/send.sms.php?order_id='.$order_id, false, $context);
Yet the URL works fine if I call it like this (it just doesn't give me access to the session):
$result = file_get_contents($base_url.'admin/send.sms.php?order_id='.$order_id);

It is because the server uses cookies to track clients. When you call the page from your browser, the session cookie is passed along with the request. When you use the file_get_contents() function, no cookie is passed and the server cannot identify the client. Here's a post that might help you.

Related

Send multiple http request at same time in php

I am trying to get the page meta tags and description from a given URL.
I have a URL array that I have to loop through, sending a cURL GET request for each one to fetch its meta data; this takes a lot of time to process.
Is there any way to process all URLs simultaneously, i.e. send the requests to all URLs at the same time and then receive each response as soon as its request completes?
For this purpose I have used
curl_multi_init()
but it's not working as expected. I have used this example:
Simultaneuos HTTP requests in PHP with cURL
I have also used this GuzzleHttp example:
Concurrent HTTP requests without opening too many connections
My code:
$urlData = [
    'http://youtube.com',
    'http://dailymotion.com',
    'http://php.net'
];

foreach ($urlData as $url) {
    $promises[] = $this->client->requestAsync('GET', $url);
}

Promise\all($promises)->then(function (array $responses) {
    foreach ($responses as $response) {
        $htmlData = $response->getBody();
        dump($htmlData);
    }
})->wait();
But I got this error
Call to undefined function GuzzleHttp\Promise\Promise\all()
I am using Guzzle 6 and Promises 1.3
I need a solution, whether in cURL or in Guzzle, to send simultaneous requests and save time.
Check your use statements. You probably have a mistake there, because the correct name is GuzzleHttp\Promise\all(). Maybe you forgot use GuzzleHttp\Promise as Promise.
Otherwise the code is correct and should work. Also check that you have the cURL extension enabled in PHP, so Guzzle will use it as the backend. It's probably there already, but worth checking ;)
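For reference, a minimal sketch of how the corrected call could look, assuming Guzzle 6 with Promises 1.3 and a plain GuzzleHttp\Client (the question uses $this->client; a standalone client is created here purely for illustration):
<?php
require 'vendor/autoload.php';

use GuzzleHttp\Client;
use GuzzleHttp\Promise;

$client = new Client();

$urlData = [
    'http://youtube.com',
    'http://dailymotion.com',
    'http://php.net'
];

$promises = [];
foreach ($urlData as $url) {
    // requestAsync() returns a promise immediately instead of blocking on the response
    $promises[] = $client->requestAsync('GET', $url);
}

// With "use GuzzleHttp\Promise;" above, Promise\all() resolves to GuzzleHttp\Promise\all()
Promise\all($promises)->then(function (array $responses) {
    foreach ($responses as $response) {
        echo substr((string) $response->getBody(), 0, 200) . "\n"; // first 200 bytes of each page
    }
})->wait();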

Callback function

So in JavaScript, I used to be able to have an HTTP request trigger a callback when AJAX sent a response back for the data I sent to the server, with the success handler being a callback function. I'm now experimenting with the OAuth2 gem for Ruby, and I'm finding callbacks not to work the same way;
I have a web server and facebook app set up, and I have a small php script that writes the current URL (including the auth code, for example) to a file, no problem. All the settings in the facebook app are set up, and if I put this URL in the browser:
http://graph.facebook.com/oauth/authorize?client_id=[my_client_id]&redirect_uri=http://localhost/oauth/callback/index.php
It redirects successfully to that script, which then writes the authorization code to a file which I can then use to get the access token. The problem is that I can only do this process manually; calling Net::HTTP.get(URI(address)) in Ruby doesn't seem to trigger the php script.
Anyone have any ideas?
I have no idea why you posted your history with javascript ajax requests, as it has no bearing on your ruby script, which by the way doesn't even use a callback method/function. Using a callback function just means you are calling some function and passing it another function as an argument. When I started programming, the term callback function was very confusing to me, and in my opinion the term should be dropped from the lingo.
As for your ruby script, you need to use something like Firebug to look at the request headers that are being sent by your browser to the server when you manually enter the url in your browser. If you use those same headers in your ruby script, then it should work, e.g.:
req['header1'] = 'hello'
req['header2'] = '10'
or:
headers = {
  'header1' => 'hello',
  'header2' => '10',
  ...
}
req = Net::HTTP::Get.new(uri.request_uri, headers)
http = Net::HTTP.new(uri.host, uri.port)
resp = http.request(req)
It's possible that you have a cookie set in your browser, which your browser automatically adds to the request headers when it sends the request to the server. Your browser probably adds thousands of headers to the request--many of which will have no bearing on your problem. If you have the patience, you can try to figure out which header is causing your ruby script's request to malfunction.
Another option is to use the mechanize gem, which will automatically handle cookies and redirects for requests sent by ruby scripts:
http://docs.seattlerb.org/mechanize/GUIDE_rdoc.html
(Read the section Let's Fetch a Page; Don't use the line require 'rubygems' if you are using ruby 1.9+).

Grab xml data from website with password [duplicate]

This question already has answers here:
How do I make a request using HTTP basic authentication with PHP curl?
(11 answers)
Closed 8 years ago.
I want to grab data from a website say www.example.com/stations with XML output:
<stations>
  <station>
    <name>Loppersum</name>
    <code>LP</code>
    <country>NL</country>
    <lat>53.334713</lat>
    <long>6.7472625</long>
    <alias>false</alias>
  </station>
  <station>
    <name>Ludinghausen</name>
    <code>ELDH</code>
    <country>D</country>
    <lat>51.76184</lat>
    <long>7.43165</long>
    <alias>true</alias>
  </station>
</stations>
But the URL is protected by a username and password (which I have).
I thought I could use cURL, but I have never used it before.
Can I also store the data as an object?
EDIT:
It is HTTP authorization and I am using PHP.
You didn't specify what kind of login scheme is in use.
If you're up against HTTP authorization, you can simply use the -u argument with curl. See this answer: Using cURL with a username and password?
If you're up against cookie authorization, it gets a bit more complicated. You'll most likely need to act as a web browser and "login" to the website, and then perform your request. Both requests will need access to a cookie jar/file that you provide to curl.
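If it does turn out to be cookie authorization, a rough sketch of that cookie jar approach could look like this (the login URL and form field names below are hypothetical; adjust them to the actual site):
<?php
$jar = tempnam(sys_get_temp_dir(), 'cookies');

// Step 1: "log in" so the server writes its session cookie into the jar.
$ch = curl_init('http://www.example.com/login');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(['user' => 'someuser', 'pass' => 'somepass']));
curl_setopt($ch, CURLOPT_COOKIEJAR, $jar);   // save any cookies the server sets
curl_exec($ch);
curl_close($ch);

// Step 2: request the protected XML, replaying the stored cookies.
$ch = curl_init('http://www.example.com/stations');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEFILE, $jar);  // send the cookies from the jar
$xml = curl_exec($ch);
curl_close($ch);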
Edit:
The author indicated that this is HTTP authorization using PHP.
The solution would be to use PHP's SimpleXMLElement to get the XML object. You can use Curl to download the XML data and pass it into the constructor, or you can have SimpleXMLElement do it for you.
Try this:
$user = 'someuser';
$pass = 'somepass';
$url = "http://$user:$pass@example.com/stations";
$obj = new SimpleXMLElement($url, 0, TRUE); // TRUE tells it to treat $url as a URL, not XML data
echo $obj->station[0]->name; // example: "Loppersum"
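Alternatively, a sketch of the cURL route mentioned above: fetch the XML with HTTP basic auth and hand the body to SimpleXMLElement (the URL is a placeholder, $user and $pass as defined above):
$ch = curl_init('http://www.example.com/stations');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_BASIC);
curl_setopt($ch, CURLOPT_USERPWD, "$user:$pass");
$xml = curl_exec($ch);
curl_close($ch);

$obj = new SimpleXMLElement($xml); // parse the downloaded XML string
echo $obj->station[0]->name;       // "Loppersum" for the sample data above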
Hope that helps.

accessing $_SESSION when using file_get_contents in PHP

I have a page called send.email.php which sends an email - pretty simple stuff - I pass an order id, it creates a job request and sends it out. This works fine in the context I developed it for (use JavaScript to make an AJAX call to the URL and pass the order_id as a query parameter).
I am now trying to reuse the exact same page in another application, however I am calling it using PHP's file_get_contents($base_url.'admin/send.email.php?order_id='.$order_id). When I call the page this way, the $_SESSION array is empty (empty($_SESSION) returns 1).
Is this because I am initiating a new session using file_get_contents and the values I stored in the $_SESSION on login are not available to me within there?
Edit: Thanks for the feedback. It makes sense that the new call doesn't have access to the existing session...
New problem though:
I now get "failed to open stream: HTTP request failed!" when trying to execute:
$opts = array('http' => array('header'=> 'Cookie: ' . $_SERVER['HTTP_COOKIE']."\r\n"));
$context = stream_context_create($opts);
$contents = file_get_contents($base_url.'admin/send.sms.php?order_id='.$order_id, false, $context);
Yet the URL works fine if I call it like this (it just doesn't give me access to the session):
$result = file_get_contents($base_url.'admin/send.sms.php?order_id='.$order_id);
file_get_contents() shouldn't be used anywhere you need authentication/session information transmitted. It's not going to send any cookies, so the user's authentication information will not be included by default.
You can kind of hack around it by including the session identifier (e.g. 'PHPSESSID' by default) as a query parameter in the URL, and have the other script check for that. But transmitting session identifiers in the URL is horribly bad practice, even if it's just to the same server.
$contents = file_get_contents("http://.... /send_sms.php?order_id=$order_id&" . session_name() . '=' . session_id());
To do this properly, use cURL and build a full HTTP request, including the cookie information of the parent page.
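A rough sketch of that cURL approach (not the poster's exact code; it assumes $base_url and $order_id are set as in the question, and that the cookie to forward is whatever the browser sent in $_SERVER['HTTP_COOKIE']):
$ch = curl_init($base_url.'admin/send.sms.php?order_id='.urlencode($order_id));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);            // return the body instead of printing it
curl_setopt($ch, CURLOPT_COOKIE, $_SERVER['HTTP_COOKIE']); // forward the caller's cookies, session id included

// If both scripts share the same session on the same server, release the session lock
// first so the sub-request doesn't hang on the session file (see the note about
// session_write_close() further down the page).
session_write_close();
$contents = curl_exec($ch);
curl_close($ch);
session_start(); // re-open the session if the parent script still needs it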
You'd have to include the file or call the respective functions from your send.sms.php script. Instead, you are calling it like a web service (which it isn't).
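For example, a sketch of that direct approach, assuming send.sms.php can be refactored so its work lives in a function (send_sms() here is hypothetical):
require_once __DIR__.'/admin/send.sms.php';

// Runs inside the same request, so the existing $_SESSION is available as-is.
$result = send_sms($order_id);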

How to send cookies with file_get_contents

I'm trying to get the contents from another file with file_get_contents (don't ask why).
I have two files: test1.php and test2.php. test1.php returns a string based on the user that is logged in.
test2.php tries to get the contents of test1.php; test2.php is the one executed by the browser, so it is the one that receives the cookies.
To send the cookies with file_get_contents, I create a streaming context:
$opts = array('http' => array('header'=> 'Cookie: ' . $_SERVER['HTTP_COOKIE']."\r\n"));
I'm retrieving the contents with:
$contents = file_get_contents("http://www.example.com/test1.php", false, $opts);
But now I get the error:
Warning: file_get_contents(http://www.example.com/test1.php) [function.file-get-contents]: failed to open stream: HTTP request failed! HTTP/1.1 404 Not Found
Does somebody knows what I'm doing wrong here?
Edit:
Forgot to mention: without the streaming context, the page just loads. But without the cookies I don't get the info I need.
First, this is probably just a typo in your question, but the third argument to file_get_contents() needs to be your streaming context, NOT the array of options. I ran a quick test with something like this, and everything worked as expected:
$opts = array('http' => array('header'=> 'Cookie: ' . $_SERVER['HTTP_COOKIE']."\r\n"));
$context = stream_context_create($opts);
$contents = file_get_contents('http://example.com/test1.txt', false, $context);
echo $contents;
The error indicates the server is returning a 404. Try fetching the URL from the machine PHP is running on and not from your workstation/desktop/laptop. It may be that your web server is having trouble reaching the site, your local machine has a cached copy, or some other network screwiness.
Be sure you repeat your exact request when running this test, including the cookie you're sending (command line curl is good for this). It's entirely possible that the page in question may load fine in a browser without the cookie, but when you send the cookie the site actually is returning a 404.
Make sure that $_SERVER['HTTP_COOKIE'] has the raw cookie you think it does.
If you're screen scraping, download Firefox and a copy of the LiveHTTPHeaders extension. Perform all the necessary steps to reach whatever page it is you want in Firefox. Then, using the output from LiveHTTPHeaders, recreate the exact same request sequence. Include every header, not just the cookies.
Finally, PHP Curl exists for a reason. If at all possible, (I'm not asking!) use it instead. :)
Just to share this information.
When using session_start(), the session file is locked by PHP, so the current script is the only one that can access it. If you then try to fetch a page that uses the same session via fsockopen() or file_get_contents(), you can wait a long time, because that request is trying to open a file that is still locked.
One way to solve this problem is to use session_write_close() to unlock the file, and then relock it afterwards with session_start().
Example:
<?php
$opts = array('http' => array('header'=> 'Cookie: ' . $_SERVER['HTTP_COOKIE']."\r\n"));
$context = stream_context_create($opts);
session_write_close(); // unlock the file
$contents = file_get_contents('http://127.0.0.1/controler.php?c=test_session', false, $context);
session_start(); // Lock the file
echo $contents;
?>
Since file_get_contents() is a blocking function, the two scripts won't try to modify the session file concurrently.
But I'm sure this is not the best way to manipulate a session over an external connection.
Btw: it's faster than cURL and fsockopen()
Let me know if you find something better.
Just out of curiosity, are you attempting file_get_contents() on a URL that has a space in it? I remember trying to use fgc on a URL that had a space in the name and while my web browser parsed it just fine, fgc didn't. I ended up having to use str_replace() to replace ' ' with '%20'.
I would think that this should have been relatively easy to spot though, as it would report only half of the filename. Also, I noticed in one of these posts someone used \r\n while defining the headers. Keep in mind that PHP doesn't interpret these inside single quotes, but they work fine in double quotes.
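For instance, a one-line version of that fix (a sketch, with $url standing in for the target URL; rawurlencode() on the whole string would also mangle the slashes, which is why only the space is replaced):
$contents = file_get_contents(str_replace(' ', '%20', $url));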
Make sure that test1.php exists on the server. Try opening it in your own browser to make sure!
