php curl_setopt URL error - php

I have a call to
$result = curl_exec($ch);
and the value of $result looks like this:
HTTP/1.1 302 Found
Date: Wed, 18 Apr 2018 12:45:05 GMT
Set-Cookie: OAMAuthnHintCookie=0#1524055505; httponly; secure; path=/; domain=.test.com
Set-Cookie: OAMRequestContext_test.test.com:443_527635=Rv52rjM82f3htVYzT+Lp0g==;max-age=300; httponly; secure; path=/
Location: https://id.test.com/obrareq.cgi?encquery%3DE6zb4nAIzYfopY8L5SbbJJPLfvrkN7Y1RkKgv4%2FSzBKmT1cY%2BhRn0A3AhCDxGFIB10DLwLMp%2BcR40CHFKhdrh2aZcEck%2Bd2pzikJ3WzWCAo5LiVW8O3CGPVoeFXUBY2orJxN9zSZXNXkAzg%2F%2F2twT%2FS1ZIUlox8fyQrKf6mITSrqbgKhn5dcC5CR79rJDCO75VEIU472JptWmPlBlEkyFT1XRO%2BUzXQHUwui92%2FGCh34PbbDrPajiyU71ycb03ffcCt0Sl1tKVNw2S%2BsUe81VH1jgV8yLWXslvl2SzsqpQUcZVZdi80HEM2ppQTsvECX%2BiyWnZ49nVBxp3YqU4nlhkAIaNaEbTEpPVF%2FvCJSuHo%3D%20agentid%3DWgtest%20ver%3D1%20crmethod%3D2
Content-Length: 676
Cache-Control: max-age=0
Expires: Wed, 18 Apr 2018 12:45:05 GMT
I extract the Location: URL with:
preg_match('/(Location: https:\/\/id\.test\.com\/obrareq\.cgi\?encquery)(.*)/', $res, $location);
and put the result in $found:
$found=$location[2];
Then I want to use this URL in a new curl_setopt call:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,'https://id.test.com/obrareq.cgi?encquery'.$found);
......
but it doesn't give me any result.
If I do the same curl_setopt call with a manual copy of the URL, it works:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,'https://id.test.com/obrareq.cgi?encquery%3DE6zb4nAIzYfopY8L5SbbJJPLfvrkN7Y1RkKgv4%2FSzBKmT1cY%2BhRn0A3AhCDxGFIB10DLwLMp%2BcR40CHFKhdrh2aZcEck%2Bd2pzikJ3WzWCAo5LiVW8O3CGPVoeFXUBY2orJxN9zSZXNXkAzg%2F%2F2twT%2FS1ZIUlox8fyQrKf6mITSrqbgKhn5dcC5CR79rJDCO75VEIU472JptWmPlBlEkyFT1XRO%2BUzXQHUwui92%2FGCh34PbbDrPajiyU71ycb03ffcCt0Sl1tKVNw2S%2BsUe81VH1jgV8yLWXslvl2SzsqpQUcZVZdi80HEM2ppQTsvECX%2BiyWnZ49nVBxp3YqU4nlhkAIaNaEbTEpPVF%2FvCJSuHo%3D%20agentid%3DWgtest%20ver%3D1%20crmethod%3D2y');
Any idea what I did wrong?
......
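For reference, a minimal sketch of the same extraction with trailing whitespace stripped. Header lines end in "\r\n", and the dot in the pattern also matches "\r", so the captured group can carry an invisible carriage return that the hand-copied URL does not have; variable names follow the question and are otherwise illustrative:
if (preg_match('/Location: (https:\/\/id\.test\.com\/obrareq\.cgi\?encquery.*)/', $result, $location)) {
    // strip the trailing \r (and any other whitespace) before reusing the URL
    $found = rtrim($location[1]);
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_URL, $found);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $next = curl_exec($ch);
}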

Related

curl: keep cookies while being redirected

I use this code (originally found in "login into webpage with php with cURL") to first grab the CSRF token and create a cookie, and then use that CSRF token and cookie in a subsequent POST request. It doesn't work (deduced from how the final webpage looks), and I think it's because FOLLOWLOCATION is set to true. It must be set to true, because there are some redirections going on, but the redirections also bring the consequence of "misplacing" cookies. The question is: how do I keep cookies while the server responds with redirects?
$cookie = 'cookies2.txt';
# Initialize a cURL session.
$ch = curl_init('https://example.com/login');
# Set the cURL options.
$options = [
    CURLOPT_COOKIEJAR      => $cookie,
    CURLOPT_RETURNTRANSFER => 1,
    CURLOPT_USERAGENT      => $useragent,
    CURLOPT_COOKIESESSION  => true,
    CURLINFO_HEADER_OUT    => true,
    CURLOPT_HEADER         => 1
];
# Set the options
curl_setopt_array($ch, $options);
# Execute
$html = curl_exec($ch);
$request = curl_getinfo($ch, CURLINFO_HEADER_OUT);
echo "1.Request sent: $request<br>";
$headerSizeFirst = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
$headersFirst = substr($html, 0, $headerSizeFirst);
echo "1.Request recieved: $headersFirst";
$dom = pQuery::parseStr($html);
$csrfToken = $dom->query('[name="csrf"]')->val();
$postData = [
    'csrf'     => $csrfToken,
    'username' => $email,
    'password' => $password
    ...........
];
# Convert the post data array to URL encoded string
$postDataStr = http_build_query($postData);
$options[CURLOPT_POST] = 1;
$options[CURLOPT_POSTFIELDS] = $postDataStr;
$options[CURLOPT_HEADER]=1;
$options[CURLOPT_COOKIEJAR]=$cookie;
$options[CURLOPT_FOLLOWLOCATION] = true;
$options[CURLOPT_RETURNTRANSFER] = true;
$options[CURLOPT_USERAGENT] = $useragent;
$options[CURLINFO_HEADER_OUT] = true;
curl_setopt_array($ch, $options);
# Execute
$response = curl_exec($ch);
$request = curl_getinfo($ch, CURLINFO_HEADER_OUT);
echo "2. Request sent: $request<br>";
$headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE);
$headers = substr($response, 0, $headerSize);
echo "2. Request recieved: $headers<br>";
echo $response;
/////// HEADER OUT AND IN DATA
1.Request sent: GET /login HTTP/1.1 Host: example.com User-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 Accept: */*
1.Request recieved: HTTP/1.1 200 OK Accept-Ranges: bytes Age: 0 Cache-Control: no-cache Content-Type: text/html; charset=UTF-8 Date: Tue, 09 Feb 2016 13:34:02 GMT Server: nginx Set-Cookie: ____ri=4485; expires=Thu, 17-Mar-16 01:34:01 GMT; path=/; domain=.example.com Set-Cookie: PHPSESSID=dk7n4kcrigi54q081tr1evd5a2; path=/; domain=.example.com Set-Cookie: ts1=11e2bb0a86bfb9669c361cc407e1e3b3decefcce; expires=Fri, 06-Feb-2026 13:34:01 GMT; path=/; domain=.example.com Set-Cookie: session=eyJpdiI6Im9cL1wvcWVUUEhDVWY5cnF2QnZzVDRIaElPWitIQmc4RVI3N1M2SGxsY3hUTT0iLCJ2YWx1ZSI6InZjVFBnelNlMmxuaXpVNlpGb2d0TzNoM21NXC85dG43cVpmbUZkQ1VuOHlnU1VOXC9XWDJaWTcxckZ2OWpZY093REd1WmZEd21FQURQNnQ3NDltMmU0NnZMWm9vSE40YmV0VG1WYkpndE9DNTdEUE4wSkp5Q1BWMXJGaFdmMHV5N0NWRmZoVHFnT0JtY1VnYjJ6N29UZkxvQThRWG9KWFV3Q0k4azdibEZvVGwrcFJRaXdaQVV4OXZLaWlDVVpmeVREc3k3cW1MOTdTdHEyY1FlZ2Vuc09WWHFOanNOTjlnK1c0SStuTWhoU0VBWGxsY1ZqRExTblBOWllpeDNqaHhZNSIsIm1hYyI6IjllMWNhMzU3NTVjYTA4NWIwZTQ0NTYxMzA3ZmIzMTAxMWE5NDg0NzZlMTNmZTc2NjhhNDI2M2ZmZDMwMzgwY2QifQ%3D%3D; expires=Fri, 19-Feb-2016 13:34:02 GMT; path=/; domain=.example.com; httponly Vary: Accept-Encoding Vary: Accept-Encoding, Accept-Encoding X-Cache: Miss X-Frame-Options: SAMEORIGIN Content-Length: 51381 Connection: keep-alive
2. Request sent: GET /inx/aeGDrYQ HTTP/1.1 Host: example.com User-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3 Accept: */* Cookie: PHPSESSID=t762fd0nbi1bp3hrgb9sgc3k20; ____ri=4485; safemode=1; session=eyJpdiI6Im1HQzlNR1JhMTNDc0JRelYyRVwveUp6N0JxZG56Z2p5K094eSs3YU5HQ3dzPSIsInZhbHVlIjoiVXBPYzN4TVNReURhVnMxQlZ1TndLZ0dYUjltbUVEcW11bkJJMDdMRVZoZ0hHMjRXZ2p6azlcL1FWXC93NnZWN3oreDcxQms3aGlcL3l0MG1vTjd1V21FcmVCVzFnQjVuMUY5dHBWeUlTbU9NSjJcL1d5TlwvTW11ZWp1eHpNd3d4eFZTamV6aThsNldkdlN3aFo0XC9sTnVnU0tXVDRKbWVBU25VU0hJaDREQ1J5M2xDXC9zRUc5OXhWMWJWWG9jYndhczYyZW4xMkUxb3BoU3FmQmMrNVdzM3RqQmgzeHY1NVJ5RXRTNGZOdmQ4dTRCbmRtWVZBN210QVVEVk1BNTFPc1NQcFU3bnd4NEpKbnRaTFliRWNzbkZaXC9YWUF1Nld1ekZSbjVGRXBuZzNoRlBNND0iLCJtYWMiOiI4OWEwNmMyZGVkYjFiYTlmNDY0MDE5MTQwNzE1YzNhYWJjYTA5YjI3MWMyZjgwMTViN2MyYmI0OWUyNmMwNjM0In0%3D; toastMsg=2; ts1=11e2bb0a86bfb9669c361cc407e1e3b3decefcce
2. Request recieved: HTTP/1.1 302 Found Accept-Ranges: bytes Age: 0 Cache-Control: no-cache, private Content-Type: text/html; charset=UTF-8 Date: Tue, 09 Feb 2016 13:34:04 GMT location: http://example.com/inx/aeGDrYQ Server: nginx Set-Cookie: PHPSESSID=thrn81mu7584dvp2ek9tpde8f4; expires=Thu, 09-Feb-2017 13:34:03 GMT; path=/; domain=.example.com Set-Cookie: PHPSESSID=t762fd0nbi1bp3hrgb9sgc3k20; expires=Thu, 09-Feb-2017 13:34:03 GMT; path=/; domain=.example.com Set-Cookie: session=eyJpdiI6Im1HQzlNR1JhMTNDc0JRelYyRVwveUp6N0JxZG56Z2p5K094eSs3YU5HQ3dzPSIsInZhbHVlIjoiVXBPYzN4TVNReURhVnMxQlZ1TndLZ0dYUjltbUVEcW11bkJJMDdMRVZoZ0hHMjRXZ2p6azlcL1FWXC93NnZWN3oreDcxQms3aGlcL3l0MG1vTjd1V21FcmVCVzFnQjVuMUY5dHBWeUlTbU9NSjJcL1d5TlwvTW11ZWp1eHpNd3d4eFZTamV6aThsNldkdlN3aFo0XC9sTnVnU0tXVDRKbWVBU25VU0hJaDREQ1J5M2xDXC9zRUc5OXhWMWJWWG9jYndhczYyZW4xMkUxb3BoU3FmQmMrNVdzM3RqQmgzeHY1NVJ5RXRTNGZOdmQ4dTRCbmRtWVZBN210QVVEVk1BNTFPc1NQcFU3bnd4NEpKbnRaTFliRWNzbkZaXC9YWUF1Nld1ekZSbjVGRXBuZzNoRlBNND0iLCJtYWMiOiI4OWEwNmMyZGVkYjFiYTlmNDY0MDE5MTQwNzE1YzNhYWJjYTA5YjI3MWMyZjgwMTViN2MyYmI0OWUyNmMwNjM0In0%3D; expires=Fri, 19-Feb-2016 13:34:04 GMT; path=/; domain=.example.com; httponly Set-Cookie: toastMsg=2; expires=Fri, 08-Feb-2019 13:34:04 GMT; path=/; domain=.example.com Set-Cookie: unverified=deleted; expires=Thu, 01-Jan-1970 00:00:01 GMT; path=/; domain=.example.com; httponly Set-Cookie: safemode=1; expires=Fri, 19-Feb-2016 13:34:04 GMT; path=/; domain=.example.com Set-Cookie: cacheableGrace=deleted; expires=Thu, 01-Jan-1970 00:00:01 GMT; path=/; domain=.example.com; httponly X-Cache: Miss X-Frame-Options: SAMEORIGIN Content-Length: 0 Connection: keep-alive HTTP/1.1 200 OK Age: 0 Cache-Control: max-age=0, public, s-maxage=60 Content-Type: text/html; charset=UTF-8 Date: Tue, 09 Feb 2016 13:34:05 GMT Server: nginx Vary: Accept-Encoding Vary: Accept-Encoding X-Cache: Miss X-Cacheable: Yes X-Frame-Options: SAMEORIGIN transfer-encoding: chunked Connection: keep-alive
CURLOPT_COOKIESESSION set to true tells libcurl to treat this as a new (cookie) session, so it will discard all session cookies at the start of the request. You should probably not set it unless you are really sure that is what you need, as it flushes all cookies that have no specific expiry date/time.
Otherwise, when the cookie engine has been activated in libcurl it will keep the cookies associated with the handle and reuse them in subsequent requests done with that same handle.
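In code form, a rough sketch of the first request without CURLOPT_COOKIESESSION, using one cookie file for both reading and writing (CURLOPT_COOKIEFILE loads cookies in on each request, CURLOPT_COOKIEJAR writes them out when the handle is closed). This is only an illustration of the answer above, not the original code:
$cookie = 'cookies2.txt';
$ch = curl_init('https://example.com/login');
curl_setopt_array($ch, [
    CURLOPT_COOKIEFILE     => $cookie,   // read stored cookies on each request
    CURLOPT_COOKIEJAR      => $cookie,   // write cookies back when the handle is closed
    CURLOPT_FOLLOWLOCATION => true,      // redirects reuse the handle's cookie engine
    CURLOPT_RETURNTRANSFER => true,
]);
$html = curl_exec($ch);
// ...parse the CSRF token, then POST on the same handle so the in-memory
// cookies (and the jar) carry the session across the redirects.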

Extract data from part JSON and part HTTP response

I'm working with an API; using cURL, I have received a set of data.
The data appears to be half HTTP response headers and half JSON. I'm not sure why it's mixed, but essentially I get this response when I do a var_dump:
string(873) "HTTP/1.1 200 OK cache-control: no-cache, no-store, must-revalidate, pre-check=0, post-check=0 content-length: 153 content-type: application/json;charset=utf-8 date: Mon, 10 Nov 2014 10:58:49 UTC expires: Tue, 31 Mar 1981 05:00:00 GMT last-modified: Mon, 10 Nov 2014 10:58:49 GMT ml: A pragma: no-cache server: tsa_b set-cookie: guest_id=v1%3A141561712923128379; Domain=.twitter.com; Path=/; Expires=Wed, 09-Nov-2016 10:58:49 UTC status: 200 OK strict-transport-security: max-age=631138519 x-connection-hash: 57175e4dba3d726bebb399072c225958 x-content-type-options: nosniff x-frame-options: SAMEORIGIN x-transaction: 2e4b8e053e615c75 x-ua-compatible: IE=edge,chrome=1 x-xss-protection: 1; mode=block {"token_type":"bearer","access_token":"AAAAAAAAAAAAAAAAAAAAAMVfbQAAAAAAK7qYRQOgdZ771TrJ6pZ7nugCwVQ%3DLKcongtwy3lcBDbPSEreC9DfhJk3Gm7qyQInqhFAxYvo1clv4S"}"
That's the full data back. It's got HTTP info at the beginning and then part JSON at the end.
The only bit I need from this is the access_token data.
If it were just JSON, I could use json_decode to get the access_token out, but because it has all the HTTP headers at the beginning, json_decode cannot parse it and returns NULL.
How can I remove the HTTP part so I can just grab the access_token data?
ETA: my request is made through cURL, so the var I'm dumping out is $response
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,$auth_url);
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, "grant_type=client_credentials");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$header = curl_setopt($ch, CURLOPT_HEADER, 1);
$result = curl_exec($ch);
curl_close($ch);
The result I receive roughly matches the expected result given in the Twitter documentation so I don't think the data is corrupt/incorrect: https://dev.twitter.com/oauth/reference/post/oauth2/token
Switch off the header output and remove
$header = curl_setopt($ch, CURLOPT_HEADER, 1);
or replace it with
curl_setopt($ch, CURLOPT_HEADER, false);
$a='HTTP/1.1 200 OK cache-control: no-cache, no-store, must-revalidate, pre-check=0, post-check=0 content-length: 153 content-type: application/json;charset=utf-8 date: Mon, 10 Nov 2014 10:58:49 UTC expires: Tue, 31 Mar 1981 05:00:00 GMT last-modified: Mon, 10 Nov 2014 10:58:49 GMT ml: A pragma: no-cache server: tsa_b set-cookie: guest_id=v1%3A141561712923128379; Domain=.twitter.com; Path=/; Expires=Wed, 09-Nov-2016 10:58:49 UTC status: 200 OK strict-transport-security: max-age=631138519 x-connection-hash: 57175e4dba3d726bebb399072c225958 x-content-type-options: nosniff x-frame-options: SAMEORIGIN x-transaction: 2e4b8e053e615c75 x-ua-compatible: IE=edge,chrome=1 x-xss-protection: 1; mode=block {"token_type":"bearer","access_token":"AAAAAAAAAAAAAAAAAAAAAMVfbQAAAAAAK7qYRQOgdZ771TrJ6pZ7nugCwVQ%3DLKcongtwy3lcBDbPSEreC9DfhJk3Gm7qyQInqhFAxYvo1clv4S"}"';
preg_match("/\{.*\}/",$a,$m);
$ja=json_decode($m[0]);
var_dump($ja,$m);
output:
object(stdClass)[1]
public 'token_type' => string 'bearer' (length=6)
public 'access_token' => string 'AAAAAAAAAAAAAAAAAAAAAMVfbQAAAAAAK7qYRQOgdZ771TrJ6pZ7nugCwVQ%3DLKcongtwy3lcBDbPSEreC9DfhJk3Gm7qyQInqhFAxYvo1clv4S' (length=112)
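Another way, assuming the response still comes from the cURL call shown in the question with CURLOPT_HEADER left on: cut the body off with CURLINFO_HEADER_SIZE (queried before curl_close) instead of using a regex, then decode only the JSON part:
$result     = curl_exec($ch);
$headerSize = curl_getinfo($ch, CURLINFO_HEADER_SIZE); // bytes taken up by the headers
curl_close($ch);
$body = substr($result, $headerSize);                  // just the JSON payload
$json = json_decode($body);
$accessToken = $json ? $json->access_token : null;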

Using php curl proxy for surfing a website

I want to set up a webpage to use a proxy automatically. Here is my script:
<?php
$url = 'http://www.sciencedirect.com/science/jrnlallbooks/a/fulltext';
$proxy = '200.93.148.72:3128';
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,$url);
curl_setopt($ch, CURLOPT_PROXY, $proxy);
curl_setopt($ch, CURLOPT_TIMEOUT, 0);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_HEADER, 1);
curl_setopt($ch, CURLOPT_COOKIEJAR, "my_cookies.txt");
curl_setopt($ch, CURLOPT_COOKIEFILE, "my_cookies.txt");
$curl_scraped_page = curl_exec($ch);
echo $curl_scraped_page;
?>
I can open the webpage and the header shows:
HTTP/1.0 200 OK Date: Fri, 23 Nov 2012 18:46:41 GMT Last-Modified: Fri, 23 Nov 2012 18:46:41 GMT Set-Cookie: MIAMISESSION=86e5ecb0-359b-11e2-b116-00000aab0f6c:3531149201; path=/; domain=.sciencedirect.com; Set-Cookie: USER_STATE_COOKIE=; expires=Thu, 01 Jan 1970 23:59:59 GMT; path=/; domain=.sciencedirect.com; Set-Cookie: SD_REMOTEACCESS=; expires=Thu, 01 Jan 1970 23:59:59 GMT; path=/; domain=.sciencedirect.com; Set-Cookie: MIAMIAUTH=6c1869e3dd5d3ca5644fdd359099d551fee57ff4f19a0d9e30c92bcc3f4cdcb755a638c57790c6f118d4a8601a914733e770454895a95214fd2a92b748418c15aabbe7e39dbfe22d18a337761caf3eebb621aa3f17803d29fa1a241d10f4aad71e83423e9562a1ec67194a18c7a016cd36828cdb6ccdaef46d038a2ee15429cd0ee88a636ec51602cee8d34e3397f0c720230f6ab68fbc74c285431372f89886ba1bbbb03f6873e2804f1577f52679f16123dd0c07d70ab0b92145c1c383e4155512e57b8da9452ad570394af0c66b0859739b1e77c2d98372d5a1b978828531f3a042a816bf4a9edbe45d4f9197a685aa1506ae57ec1593efd428842244a96f9d2033b43ccf50a14843907943eb57b7c9dd1bef11603f9e686aad6847870ac6fec520209a31df9efb3d0ee4e24341c4c5dd6c12060a6a624c3ff60ec16286f7cb6c3839f8f375c00c836958eada8d4900baa294fa3645c02f1b3ac78c7bc78bc2d79f5f4e038b6ae465d63f0100a53731ec826eba3c6f8f648bf03d6ac7d450788f0362055ca413073d9333348cacdc6e4d6222a420a78620a968b185954fcc76b3a9a63f2e62f9; path=/; domain=.sciencedirect.com; Set-Cookie: TARGET_URL=fcf74dd786744d87fbaaaf8652a764ab4a79b0d3ed681139e9106923760631052596d348948479933da48b3723069bbf09065290c950dc02c1f0d1436659ad5a; path=/; domain=.sciencedirect.com; Set-Cookie: MIAMIAUTH=6c1869e3dd5d3ca5644fdd359099d551fee57ff4f19a0d9e30c92bcc3f4cdcb755a638c57790c6f118d4a8601a914733e770454895a95214fd2a92b748418c15aabbe7e39dbfe22d18a337761caf3eebb621aa3f17803d29fa1a241d10f4aad71e83423e9562a1ec67194a18c7a016cd36828cdb6ccdaef46d038a2ee15429cd0ee88a636ec51602cee8d34e3397f0c720230f6ab68fbc74c285431372f89886ba1bbbb03f6873e2804f1577f52679f16123dd0c07d70ab0b92145c1c383e4155512e57b8da9452ad570394af0c66b0859739b1e77c2d98372d5a1b978828531f3a042a816bf4a9edbe45d4f9197a685aa1506ae57ec1593efd428842244a96f9d2033b43ccf50a14843907943eb57b7c9dd1bef11603f9e686aad6847870ac6fec520209a31df9efb3d0ee4e24341c4c5dd6c12060a6a624c3ff60ec16286f7cb6c3839f8f375c00c836958eada8d4900baa294fa3645c02f1b3ac78c7bc78bc2d79f5f4e038b6ae465d63f0100a53731ec826eba3c6f8fe5378db869312c80a0addc0d8946f7f6552daa333f2e38da51e23c1d44ae41176c1b2e70f7b144f63a44c25741cd0126; path=/; domain=.sciencedirect.com; Content-Type: text/html Expires: Tue, 01 Jan 1980 05:00:00 GMT X-RE-Ref: 0 19194695 Server: www.sciencedirect.com P3P: CP="IDC DSP LAW ADM DEV TAI PSA PSD IVA IVD CON HIS TEL OUR DEL SAM OTR IND OTC" Vary: Accept-Encoding, User-Agent X-Cache: MISS from alejandria.ufps.edu.co X-Cache-Lookup: HIT from alejandria.ufps.edu.co:3128 Via: 1.0 alejandria.ufps.edu.co (squid/3.0.STABLE15) Proxy-Connection: close
However, the proxy only works for this page. When I click other links on the page, they don't go through the proxy. How can I improve my script so that the whole website (all links) goes through the proxy?
It seems that you need to use regular expressions to rewrite any links within the HTML code (for example <a href="...">) so they point to your script. Then you pass the target page to cURL as an argument, so a rewritten link will look something like http://YourSite.com/proxy.php?site=http://example.com/smth/foo.php
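A very rough sketch of that idea (proxy.php and the site parameter are illustrative names; real link rewriting and input validation need far more care than this):
<?php
// proxy.php - fetch ?site=... through the proxy and rewrite absolute links
$site  = $_GET['site'];            // no validation here; do not use as-is in production
$proxy = '200.93.148.72:3128';
$ch = curl_init($site);
curl_setopt($ch, CURLOPT_PROXY, $proxy);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$page = curl_exec($ch);
curl_close($ch);
// Point every absolute link back at this script so the next click also goes through the proxy.
$page = preg_replace('/href="(https?:\/\/[^"]+)"/i', 'href="proxy.php?site=$1"', $page);
echo $page;
?>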

PHP Curl to get a download filesize

I use this script on two different servers:
function curlGetFileInfo($url, $cookies="default"){
global $_config;
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_NOBODY, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_COOKIEFILE, 'serverpath/cookies/'.$cookies.'.txt');
$data = curl_exec($ch);
curl_close($ch);
if ($data === false) {
return 0;
}
//echo $data;
$info['filename'] = get_between($data, 'filename="', '"');
$info['extension'] = end(explode(".",$info['filename']));
if (preg_match('/Content-Length: (\d+)/', $data, $matches)) {
$info['filesize'] = (int)$matches[1];
}
return $info;
}
These servers have the same PHP version and the same PHP cURL version. These are the two different headers from the cURL result:
Working one:
HTTP/1.1 302 Found
Date: Tue, 12 Jun 2012 07:04:35 GMT
Server: Apache/2.2.16 (Debian)
X-Powered-By: PHP/5.3.3-7+squeeze13
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
Location: http://stor1076.uploaded.to/dl/b3411ded-0f45-4efc-b705-8c8ac89b5e41
Vary: Accept-Encoding
Connection: close
Content-Type: text/html

HTTP/1.1 200 OK
Server: nginx/1.0.5
Date: Tue, 12 Jun 2012 07:04:35 GMT
Content-Type: video/x-msvideo
Content-Length: 733919232
Last-Modified: Tue, 29 May 2012 15:10:07 GMT
Connection: keep-alive
Content-Disposition: attachment; filename="Saw.[Spanish.DVDRip].[XviD-Mp3].by.SDG.avi"
Accept-Ranges: bytes
Non working one:
HTTP/1.1 302 Found
Date: Tue, 12 Jun 2012 07:05:26 GMT
Server: Apache/2.2.16 (Debian)
X-Powered-By: PHP/5.3.3-7+squeeze13
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Pragma: no-cache
Location: http://stor1164.uploaded.to/dl/22c3d242-365d-4e1e-b903-f1e2b81812c2
Vary: Accept-Encoding
Connection: close
Content-Type: text/html
Cookies are set OK (with login), and other simple Curl functions are working fine.
Also, I did a curl_getinfo($ch, CURLINFO_HTTP_CODE) and it gives me this result:
Working one:
200
Non working one:
302
Any idea?
On the working one you seem to be running Apache as well as nginx. You can see there are two HTTP responses:
HTTP/1.1 302 Found Date: Tue, 12 Jun 2012 07:04:35 GMT Server: Apache/2.2.16 (Debian)
HTTP/1.1 200 OK Server: nginx/1.0.5
So, your setup differs. I don't know how exactly they are running together, but this gives some insight and may help you solve it: http://kbeezie.com/view/apache-with-nginx/
OK, it was an open_basedir problem. Thanks, guys.
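For anyone hitting the same thing: on older PHP builds cURL refuses CURLOPT_FOLLOWLOCATION when open_basedir (or safe_mode) is set, so one workaround is to follow the Location header by hand. A rough sketch; the function name is illustrative:
function curlHeadFollow($url, $maxHops = 5) {
    while ($maxHops-- > 0) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_NOBODY, true);
        curl_setopt($ch, CURLOPT_HEADER, true);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $headers = curl_exec($ch);
        $code    = curl_getinfo($ch, CURLINFO_HTTP_CODE);
        curl_close($ch);
        // stop on anything that is not a redirect or has no Location header
        if ($code < 300 || $code >= 400 || !preg_match('/^Location:\s*(\S+)/mi', $headers, $m)) {
            return $headers;
        }
        $url = $m[1];   // follow the redirect manually on the next loop
    }
    return false;       // too many redirects
}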

use cURL to get HTTP header and save to variable

I am using this to grab an XML feed and the HTTP headers:
// Initiate the curl session
$ch = curl_init();
// Set the URL
curl_setopt($ch, CURLOPT_URL, $url);
// Allow the headers
curl_setopt($ch, CURLOPT_HEADER, true);
// Return the output instead of displaying it directly
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// Execute the curl session
$output = curl_exec($ch);
// Close the curl session
curl_close($ch);
// Cache feed
file_put_contents($filename, $output);
// XML to object
$output = simplexml_load_string($output);
// Return the output as an array
return $output;
And it returns these headers
HTTP/1.1 200 OK
Cache-Control: public, max-age=30
Content-Length: 5678
Content-Type: text/xml
Expires: Tue, 22 Nov 2011 15:12:16 GMT
Last-Modified: Tue, 22 Nov 2011 15:11:46 GMT
Vary: *
Server: Microsoft-IIS/7.0
Set-Cookie: ASP.NET_SessionId=1pfijrmsqndn5ws3124csmhe; path=/; HttpOnly
Data-Last-Updated: 11/22/2011 15:11:45
X-AspNet-Version: 4.0.30319
X-Powered-By: ASP.NET
Date: Tue, 22 Nov 2011 15:11:46 GMT
I only want it to return one part of the HTTP header, which is
Expires: Tue, 22 Nov 2011 15:12:16 GMT
and save that to a variable. How do I do this? I have been looking through the PHP manual but can't work it out.
preg_match('~^(Expires:.+?)$~ism', $headers, $result);
return $result[1];
You can use PHP's get_headers($url, 1) function, which returns all the header information in an array, and if you set the second optional param to 1 (non-zero), then it parses it into an associative array like:
array(
[...] => ...,
[Expires] => 'what you need',
[...] => ...
)
http://php.net/manual/en/function.get-headers.php
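For completeness, a small usage sketch; note that get_headers() makes its own request to $url, separate from the cURL call above:
$headers = get_headers($url, 1);   // 1 = parse into an associative array keyed by header name
$expires = isset($headers['Expires']) ? $headers['Expires'] : null;
echo $expires;                     // e.g. Tue, 22 Nov 2011 15:12:16 GMT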
