I want to set up a webpage to use a proxy automatically. Here is my script:
<?php
$url = 'http://www.sciencedirect.com/science/jrnlallbooks/a/fulltext';
$proxy = '200.93.148.72:3128';
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,$url);
curl_setopt($ch, CURLOPT_PROXY, $proxy);
curl_setopt($ch, CURLOPT_TIMEOUT, 0);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_HEADER, 1);
curl_setopt($ch, CURLOPT_COOKIEJAR, "my_cookies.txt");
curl_setopt($ch, CURLOPT_COOKIEFILE, "my_cookies.txt");
$curl_scraped_page = curl_exec($ch);
echo $curl_scraped_page;
?>
I can open the webpage and the header shows:
HTTP/1.0 200 OK Date: Fri, 23 Nov 2012 18:46:41 GMT Last-Modified: Fri, 23 Nov 2012 18:46:41 GMT Set-Cookie: MIAMISESSION=86e5ecb0-359b-11e2-b116-00000aab0f6c:3531149201; path=/; domain=.sciencedirect.com; Set-Cookie: USER_STATE_COOKIE=; expires=Thu, 01 Jan 1970 23:59:59 GMT; path=/; domain=.sciencedirect.com; Set-Cookie: SD_REMOTEACCESS=; expires=Thu, 01 Jan 1970 23:59:59 GMT; path=/; domain=.sciencedirect.com; Set-Cookie: MIAMIAUTH=6c1869e3dd5d3ca5644fdd359099d551fee57ff4f19a0d9e30c92bcc3f4cdcb755a638c57790c6f118d4a8601a914733e770454895a95214fd2a92b748418c15aabbe7e39dbfe22d18a337761caf3eebb621aa3f17803d29fa1a241d10f4aad71e83423e9562a1ec67194a18c7a016cd36828cdb6ccdaef46d038a2ee15429cd0ee88a636ec51602cee8d34e3397f0c720230f6ab68fbc74c285431372f89886ba1bbbb03f6873e2804f1577f52679f16123dd0c07d70ab0b92145c1c383e4155512e57b8da9452ad570394af0c66b0859739b1e77c2d98372d5a1b978828531f3a042a816bf4a9edbe45d4f9197a685aa1506ae57ec1593efd428842244a96f9d2033b43ccf50a14843907943eb57b7c9dd1bef11603f9e686aad6847870ac6fec520209a31df9efb3d0ee4e24341c4c5dd6c12060a6a624c3ff60ec16286f7cb6c3839f8f375c00c836958eada8d4900baa294fa3645c02f1b3ac78c7bc78bc2d79f5f4e038b6ae465d63f0100a53731ec826eba3c6f8f648bf03d6ac7d450788f0362055ca413073d9333348cacdc6e4d6222a420a78620a968b185954fcc76b3a9a63f2e62f9; path=/; domain=.sciencedirect.com; Set-Cookie: TARGET_URL=fcf74dd786744d87fbaaaf8652a764ab4a79b0d3ed681139e9106923760631052596d348948479933da48b3723069bbf09065290c950dc02c1f0d1436659ad5a; path=/; domain=.sciencedirect.com; Set-Cookie: 
MIAMIAUTH=6c1869e3dd5d3ca5644fdd359099d551fee57ff4f19a0d9e30c92bcc3f4cdcb755a638c57790c6f118d4a8601a914733e770454895a95214fd2a92b748418c15aabbe7e39dbfe22d18a337761caf3eebb621aa3f17803d29fa1a241d10f4aad71e83423e9562a1ec67194a18c7a016cd36828cdb6ccdaef46d038a2ee15429cd0ee88a636ec51602cee8d34e3397f0c720230f6ab68fbc74c285431372f89886ba1bbbb03f6873e2804f1577f52679f16123dd0c07d70ab0b92145c1c383e4155512e57b8da9452ad570394af0c66b0859739b1e77c2d98372d5a1b978828531f3a042a816bf4a9edbe45d4f9197a685aa1506ae57ec1593efd428842244a96f9d2033b43ccf50a14843907943eb57b7c9dd1bef11603f9e686aad6847870ac6fec520209a31df9efb3d0ee4e24341c4c5dd6c12060a6a624c3ff60ec16286f7cb6c3839f8f375c00c836958eada8d4900baa294fa3645c02f1b3ac78c7bc78bc2d79f5f4e038b6ae465d63f0100a53731ec826eba3c6f8fe5378db869312c80a0addc0d8946f7f6552daa333f2e38da51e23c1d44ae41176c1b2e70f7b144f63a44c25741cd0126; path=/; domain=.sciencedirect.com; Content-Type: text/html Expires: Tue, 01 Jan 1980 05:00:00 GMT X-RE-Ref: 0 19194695 Server: www.sciencedirect.com P3P: CP="IDC DSP LAW ADM DEV TAI PSA PSD IVA IVD CON HIS TEL OUR DEL SAM OTR IND OTC" Vary: Accept-Encoding, User-Agent X-Cache: MISS from alejandria.ufps.edu.co X-Cache-Lookup: HIT from alejandria.ufps.edu.co:3128 Via: 1.0 alejandria.ufps.edu.co (squid/3.0.STABLE15) Proxy-Connection: close
However, the proxy only works for this one page. When I click other links on the page, the requests no longer go through the proxy. How can I improve my script so that the whole website (all links) is fetched through the proxy?
It seems that you need to rewrite any links within the returned HTML (for example <a href="...">) to point back to your script, and pass the original URL as an argument that your script then feeds to cURL, so a link ends up looking something like http://YourSite.com/proxy.php?site=http://example.com/smth/foo.php
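That link-rewriting idea can be sketched roughly as follows (the proxy.php name, the ?site= parameter, and the simple regex are illustrative assumptions; a robust rewriter would also need to handle relative URLs, CSS, and JavaScript-generated links):

```php
<?php
// Rewrite absolute links in fetched HTML so they point back at the proxy
// script. The proxy.php endpoint and ?site= parameter are assumed names.
function rewrite_links(string $html, string $proxyBase): string
{
    return preg_replace_callback(
        '/(href|src)="(https?:\/\/[^"]+)"/i',
        function (array $m) use ($proxyBase) {
            // Encode the original URL so it survives as a query argument.
            return $m[1] . '="' . $proxyBase . '?site=' . urlencode($m[2]) . '"';
        },
        $html
    );
}

$page = '<a href="http://example.com/smth/foo.php">link</a>';
echo rewrite_links($page, 'http://YourSite.com/proxy.php');
// <a href="http://YourSite.com/proxy.php?site=http%3A%2F%2Fexample.com%2Fsmth%2Ffoo.php">link</a>
```

Your proxy.php would then urldecode the site parameter, fetch it with the same cURL/proxy setup as above, and run the returned HTML through the rewriter before echoing it.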
Related
I searched for this on Google, but all the results are about curl not being able to save the file at all. My case is different: curl saves the file correctly, but its contents are only this:
# Netscape HTTP Cookie File
# https://curl.haxx.se/docs/http-cookies.html
# This file was generated by libcurl! Edit at your own risk.
--- no cookie contents here ---- only header ----
I have this website running in XAMPP and everything works.
When I migrated to IIS, the only problem I have is that curl does not save the cookie contents to the file.
I have no clue what I did wrong.
//Initiate cURL.
$ch = curl_init($url);
//The JSON data.
$jsonData = array(
    'login'    => $login,
    'password' => $password,
    'idioma'   => $idioma,
    'server'   => $server,
    'sistema'  => $sistema
);
//Encode the array into JSON.
$jsonDataEncoded = json_encode($jsonData);
//curl_setopt($ch, CURLOPT_COOKIEJAR, dirname(__FILE__) . '/cookie.txt');
curl_setopt($ch, CURLOPT_COOKIEJAR, "C:\cookies\cookieFile.txt");
curl_setopt($ch, CURLOPT_COOKIEFILE, "C:\cookies\cookieFile.txt");
//Tell cURL that we want to send a POST request.
curl_setopt($ch, CURLOPT_POST, 1);
//curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
//Attach our encoded JSON string to the POST fields.
curl_setopt($ch, CURLOPT_POSTFIELDS, $jsonDataEncoded);
//Set the content type to application/json
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: application/json'));
//Execute the request
$data = curl_exec($ch);
if ($data) {
    //var_dump($data);
} else {
    echo "Could not open the System on the Eticadata service"; // original message: "Não foi possível abrir o Sistema no Serviço Eticadata"
}
curl_close($ch);
To make sure I am receiving header cookies, I added this:
curl_setopt($ch, CURLOPT_HEADER, 1);
This is in the response when I var_dump $data:
HTTP/1.1 200 OK Cache-Control: no-cache Pragma: no-cache Content-Type: application/json; charset=utf-8 Expires: -1 Server: Microsoft-IIS/10.0 Set-Cookie: ASP.NET_SessionId=ke2s2lf0mkdbqlcfbghrqgg5; path=/; HttpOnly X-AspNet-Version: 4.0.30319 Set-Cookie: .eti_ASPXAUTH=5DE2B541B7CCE00C4356B3D0D9CA0D82C9819A5EB9E915924913DCB3B7FB511D6A27026CECC878FD295B1E8F262B53A8A9F9946318BEEE4C0487ED9F9B64547451EDC9D2C4458A8FEECFEABBD24EE0CF2DD2FCD80B4C3931622309DB443E35066A71B8C01A160F4DDA5FE8594E4C7DD6E62B2D55EB25FCCC9C1C7F304F3285E0; path=/; HttpOnly Set-Cookie: eti_sessionInfo=YwBzAHcAcwBxAGwAXABjAG8AbQBwAHUAcwCnAFMAaQBzAHQAZQBtAGEAQwBTAFcApwBDAFMAVwCnAEUAeAAgADIAMAAxADkApwAxAKcAUABUAC0AUABUAA==; path=/ X-Powered-By: ASP.NET Date: Wed, 12 Jun 2019 15:27:01 GMT Content-Length: 56350
But looking at C:\cookies\cookieFile.txt its contents are:
# Netscape HTTP Cookie File
# https://curl.haxx.se/docs/http-cookies.html
# This file was generated by libcurl! Edit at your own risk.
And no cookies inside.
Edit
Here's the verbose output of curl:
* upload completely sent off: 106 out of 106 bytes
< HTTP/1.1 200 OK
< Cache-Control: no-cache
< Pragma: no-cache
< Content-Type: application/json; charset=utf-8
< Expires: -1
< Server: Microsoft-IIS/10.0
* cookie 'ASP.NET_SessionId' dropped, domain 'cswsql' must not set cookies for 'cswsql'
< Set-Cookie: ASP.NET_SessionId=khdalre2kjro4usbaq4co33c; path=/; HttpOnly
< X-AspNet-Version: 4.0.30319
* cookie '.eti_ASPXAUTH' dropped, domain 'cswsql' must not set cookies for 'cswsql'
< Set-Cookie: .eti_ASPXAUTH=75524EDA1707527A449A395913EDB222D37F40171D44037CA430958BD2F24BF90C2975461F57F58EC6A6E0450ACD98AE64092B13CCDAA73E4CE7207AA2CE834688CBD4113C82AFAB4F513FC2DEFFCEF34496FF47896BCF8BBB3424FD6F6F8F1593B7869E2647AEB82F9D6CD89B2E939E38036D4A94635072996E88528225E793; path=/; HttpOnly
* cookie 'eti_sessionInfo' dropped, domain 'cswsql' must not set cookies for 'cswsql'
< Set-Cookie: eti_sessionInfo=YwBzAHcAcwBxAGwAXABjAG8AbQBwAHUAcwCnAFMAaQBzAHQAZQBtAGEAQwBTAFcApwBDAFMAVwCnAEUAeAAgADIAMAAxADkApwAxAKcAUABUAC0AUABUAA==; path=/
< X-Powered-By: ASP.NET
< Date: Thu, 13 Jun 2019 14:09:21 GMT
< Content-Length: 56350
<
* Connection #0 to host cswsql left intact
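Those `cookie ... dropped, domain 'cswsql' must not set cookies` lines are the key: libcurl's cookie engine refuses to store cookies coming from a single-label (dot-less) hostname like cswsql. Assuming that is the cause here, either address the server by a fully qualified domain name, or capture the Set-Cookie headers yourself. A minimal sketch of the manual route (the helper name and the wiring comments are illustrative, not the library's API):

```php
<?php
// Pull name=value pairs out of raw Set-Cookie header lines so they can be
// replayed on the next request with CURLOPT_COOKIE, sidestepping the cookie
// engine's domain check. Attribute parts (path, HttpOnly, ...) are dropped.
function extract_cookies(array $headerLines): array
{
    $cookies = [];
    foreach ($headerLines as $line) {
        if (stripos($line, 'Set-Cookie:') === 0) {
            $pair = trim(explode(';', substr($line, strlen('Set-Cookie:')))[0]);
            [$name, $value] = explode('=', $pair, 2);
            $cookies[$name] = $value;
        }
    }
    return $cookies;
}

// Collect the header lines during the request (handle setup omitted):
//   curl_setopt($ch, CURLOPT_HEADERFUNCTION, function ($ch, $line) use (&$headerLines) {
//       $headerLines[] = $line;
//       return strlen($line);
//   });
// Replay them on the follow-up request, e.g.:
//   curl_setopt($ch, CURLOPT_COOKIE, 'ASP.NET_SessionId=' . $cookies['ASP.NET_SessionId']);

$headerLines = ['Set-Cookie: ASP.NET_SessionId=ke2s2lf0mkdbqlcfbghrqgg5; path=/; HttpOnly'];
print_r(extract_cookies($headerLines)); // [ASP.NET_SessionId] => ke2s2lf0mkdbqlcfbghrqgg5
```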
I have a
$result = curl_exec($ch);
and the value of $result looks like this:
HTTP/1.1 302 Found
Date: Wed, 18 Apr 2018 12:45:05 GMT
Set-Cookie: OAMAuthnHintCookie=0#1524055505; httponly; secure; path=/; domain=.test.com
Set-Cookie: OAMRequestContext_test.test.com:443_527635=Rv52rjM82f3htVYzT+Lp0g==;max-age=300; httponly; secure; path=/
Location: https://id.test.com/obrareq.cgi?encquery%3DE6zb4nAIzYfopY8L5SbbJJPLfvrkN7Y1RkKgv4%2FSzBKmT1cY%2BhRn0A3AhCDxGFIB10DLwLMp%2BcR40CHFKhdrh2aZcEck%2Bd2pzikJ3WzWCAo5LiVW8O3CGPVoeFXUBY2orJxN9zSZXNXkAzg%2F%2F2twT%2FS1ZIUlox8fyQrKf6mITSrqbgKhn5dcC5CR79rJDCO75VEIU472JptWmPlBlEkyFT1XRO%2BUzXQHUwui92%2FGCh34PbbDrPajiyU71ycb03ffcCt0Sl1tKVNw2S%2BsUe81VH1jgV8yLWXslvl2SzsqpQUcZVZdi80HEM2ppQTsvECX%2BiyWnZ49nVBxp3YqU4nlhkAIaNaEbTEpPVF%2FvCJSuHo%3D%20agentid%3DWgtest%20ver%3D1%20crmethod%3D2
Content-Length: 676
Cache-Control: max-age=0
Expires: Wed, 18 Apr 2018 12:45:05 GMT
I extract the Location: URL with:
preg_match('/(Location: https:\/\/id\.test\.com\/obrareq\.cgi\?encquery)(.*)/', $res, $location);
and put the result in $found:
$found=$location[2];
Then I want to use this URL in a new curl_setopt:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,'https://id.test.com/obrareq.cgi?encquery'.$found);
......
but it doesn't give me any result.
If I do the same curl_setopt with a manually copied URL, it works.
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,'https://id.test.com/obrareq.cgi?encquery%3DE6zb4nAIzYfopY8L5SbbJJPLfvrkN7Y1RkKgv4%2FSzBKmT1cY%2BhRn0A3AhCDxGFIB10DLwLMp%2BcR40CHFKhdrh2aZcEck%2Bd2pzikJ3WzWCAo5LiVW8O3CGPVoeFXUBY2orJxN9zSZXNXkAzg%2F%2F2twT%2FS1ZIUlox8fyQrKf6mITSrqbgKhn5dcC5CR79rJDCO75VEIU472JptWmPlBlEkyFT1XRO%2BUzXQHUwui92%2FGCh34PbbDrPajiyU71ycb03ffcCt0Sl1tKVNw2S%2BsUe81VH1jgV8yLWXslvl2SzsqpQUcZVZdi80HEM2ppQTsvECX%2BiyWnZ49nVBxp3YqU4nlhkAIaNaEbTEpPVF%2FvCJSuHo%3D%20agentid%3DWgtest%20ver%3D1%20crmethod%3D2y');
Any idea what I did wrong?
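One likely culprit (an assumption, but a common one with this symptom): HTTP header lines end in \r\n, and `.` in a pattern stops at \n but still matches \r, so the captured group carries an invisible trailing carriage return that corrupts the URL you build from it. A small sketch of the check and the fix, on a canned response:

```php
<?php
// Header lines end in "\r\n"; '.' in a pattern stops at "\n" but happily
// matches the "\r", so the capture ends with an invisible carriage return.
$res = "HTTP/1.1 302 Found\r\n"
     . "Location: https://id.test.com/obrareq.cgi?encquery%3Dabc%20ver%3D1\r\n"
     . "Content-Length: 676\r\n";

preg_match('/Location: (https:\/\/id\.test\.com\/obrareq\.cgi\?encquery.*)/', $res, $location);

$raw   = $location[1];
$found = rtrim($raw, "\r\n"); // strip the line terminator before reuse

var_dump($raw === $found);    // bool(false): the stray "\r" really was there
// curl_setopt($ch, CURLOPT_URL, $found); // safe to use now
```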
......
Code:
$ch = curl_init();
curl_setopt ($ch, CURLOPT_URL, "https://detail.1688.com/offer/543783250479.html?sk=consign");
curl_setopt($ch, CURLOPT_HEADER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$data = curl_exec($ch); // note: the original called curl_exec() twice, issuing the request twice
curl_close($ch);
var_dump($data);
Result:
string(339) "HTTP/1.1 403 Forbidden
Server: nginx/1.11.1
Date: Tue, 16 May 2017 03:46:32 GMT
Content-Type: text/html; charset=utf-8
Content-Length: 169
Connection: keep-alive
<html>
<head><title>403 Forbidden</title></head>
<body bgcolor="white">
<center><h1>403 Forbidden</h1></center>
<hr><center>nginx/1.11.1</center>
</body>
</html>
"
When I run curl -I https://detail.1688.com/offer/543783250479.html?sk=consign in my shell, it returns 200:
HTTP/1.1 200 OK
Date: Tue, 16 May 2017 03:46:51 GMT
Content-Type: text/html;charset=GBK
Connection: keep-alive
Vary: Accept-Encoding
Expires: Thu, 01-Jan-1970 00:00:00 GMT
Cache-Control: max-age=0,s-maxage=0
b2c_auction: 543783250479
atp_isdpp: 99vb2b-2295161471
page_cache_info: {"is_new":true,"create_time":"2017-05-16T11:46:51","expire_time":3600000}
X-Cache: MISS TCP_REFRESH_MISS dirn:-2:-2
Via: aserver010103196008.et2[69,200-0,M]
url-hash: id=543783250479&detail.1688.com
Server: Tengine/Aserver
Strict-Transport-Security: max-age=31536000
Timing-Allow-Origin: *
EagleEye-TraceId: 0b83e0c214949064118297808e926
Could anyone please give me some hints about why I get a 403 via cURL in PHP?
Environment:
Copyright (c) 1997-2016 The PHP Group
Zend Engine v3.0.0, Copyright (c) 1998-2016 Zend Technologies
with Zend OPcache v7.0.8-2+deb.sury.org~xenial+1, Copyright (c) 1999-2016, by Zend Technologies
with blackfire v1.10.6, https://blackfire.io, by Blackfireio Inc.
The headers returned without using a user agent:
HTTP/1.1 302 Moved Temporarily
Date: Tue, 16 May 2017 04:15:46 GMT
Content-Type: text/html
Content-Length: 266
Connection: keep-alive
Location: http://127.0.0.1/?sk=consign
X-Cache: MISS TCP_MISS dirn:-2:-2
Via: aserver010103196008.et2[0,302-0,M]
url-hash: id=543783250479&detail.1688.com
Server: Tengine/Aserver
Strict-Transport-Security: max-age=31536000
Timing-Allow-Origin: *
EagleEye-TraceId: 0b83dc9c14949081466171756eb58d
The important part is:
Location: http://127.0.0.1/?sk=consign
If I use a user agent, I get:
HTTP/1.1 200 OK
Date: Tue, 16 May 2017 04:17:30 GMT
Content-Type: text/html;charset=GBK
Transfer-Encoding: chunked
Connection: keep-alive
Vary: Accept-Encoding
Expires: Thu, 01-Jan-1970 00:00:00 GMT
Cache-Control: max-age=0,s-maxage=0
b2c_auction: 543783250479
atp_isdpp: 99vb2b-2295161471
page_cache_info: {"is_new":true,"create_time":"2017-05-16T12:17:30","expire_time":3600000}
X-Cache: MISS TCP_MISS dirn:-2:-2
Via: aserver011128044194.eu13[106,200-0,M]
url-hash: id=543783250479&detail.1688.com
Server: Tengine/Aserver
Strict-Transport-Security: max-age=31536000
Timing-Allow-Origin: *
EagleEye-TraceId: 0b802cd414949082503341644e23a0
which is correct, and it returns the desired HTML.
Code used:
$url = "http://detail.1688.com/offer/543783250479.html?sk=consign";
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_USERAGENT, "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:53.0) Gecko/20100101 Firefox/53.0");
$html = curl_exec($ch);
curl_close($ch);
print $html;
In curl, -I stands for a HEAD request. Try setting up a HEAD request in your PHP as well:
curl_setopt($ch, CURLOPT_NOBODY, true);
And give it a try
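Putting the two fixes together, a sketch of the full setup (the URL is the one from the question; CURLOPT_NOBODY is what makes cURL send a HEAD request, mirroring curl -I, and the user agent string is an illustrative choice):

```php
<?php
// Build the options for a HEAD request that mirrors `curl -I URL`.
function head_request_options(string $url): array
{
    return [
        CURLOPT_URL            => $url,
        CURLOPT_NOBODY         => true,  // HEAD: ask for headers only, no body
        CURLOPT_HEADER         => true,  // include the headers in the output
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_USERAGENT      => 'Mozilla/5.0 (Windows NT 10.0; Win64; x64)',
    ];
}

$ch = curl_init();
curl_setopt_array($ch, head_request_options(
    'https://detail.1688.com/offer/543783250479.html?sk=consign'
));
// $headers = curl_exec($ch); // network call left commented out in this sketch
curl_close($ch);
```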
I'm working with an API, using cURL I have received a set of data.
The data appears to be half HTTP response headers and half JSON. I'm not sure why it's mixed, but essentially I get this response when I do a var_dump:
string(873) "HTTP/1.1 200 OK cache-control: no-cache, no-store, must-revalidate, pre-check=0, post-check=0 content-length: 153 content-type: application/json;charset=utf-8 date: Mon, 10 Nov 2014 10:58:49 UTC expires: Tue, 31 Mar 1981 05:00:00 GMT last-modified: Mon, 10 Nov 2014 10:58:49 GMT ml: A pragma: no-cache server: tsa_b set-cookie: guest_id=v1%3A141561712923128379; Domain=.twitter.com; Path=/; Expires=Wed, 09-Nov-2016 10:58:49 UTC status: 200 OK strict-transport-security: max-age=631138519 x-connection-hash: 57175e4dba3d726bebb399072c225958 x-content-type-options: nosniff x-frame-options: SAMEORIGIN x-transaction: 2e4b8e053e615c75 x-ua-compatible: IE=edge,chrome=1 x-xss-protection: 1; mode=block {"token_type":"bearer","access_token":"AAAAAAAAAAAAAAAAAAAAAMVfbQAAAAAAK7qYRQOgdZ771TrJ6pZ7nugCwVQ%3DLKcongtwy3lcBDbPSEreC9DfhJk3Gm7qyQInqhFAxYvo1clv4S"}"
That's the full data back. It's got HTTP info at the beginning and then part JSON at the end.
The only bit I need from this is the access_token data.
If it was just JSON then I could use json_decode to get the access_token out but because it's got all the HTTP info at the beginning json_decode cannot understand it and gives the result NULL.
How can I remove the HTTP part so I can just grab the access_token data?
ETA: my request is made through cURL, so the var I'm dumping out is $response
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL,$auth_url);
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_POST, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, "grant_type=client_credentials");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$header = curl_setopt($ch, CURLOPT_HEADER, 1);
$result = curl_exec($ch);
curl_close($ch);
The result I receive roughly matches the expected result given in the Twitter documentation so I don't think the data is corrupt/incorrect: https://dev.twitter.com/oauth/reference/post/oauth2/token
Switch off the header output: remove
$header = curl_setopt($ch, CURLOPT_HEADER, 1);
or replace it with
curl_setopt($ch, CURLOPT_HEADER, false);
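If the headers are needed for something else, an alternative to turning them off is to cut the body off at the header length that cURL reports via CURLINFO_HEADER_SIZE (read it before curl_close). A sketch of that split, demonstrated on a canned response:

```php
<?php
// With CURLOPT_HEADER on, cURL knows exactly how long the header block is:
//   $size = curl_getinfo($ch, CURLINFO_HEADER_SIZE); // before curl_close()
//   $body = substr($result, $size);
// The same split on a canned response standing in for $result:
$response = "HTTP/1.1 200 OK\r\n"
          . "content-type: application/json;charset=utf-8\r\n"
          . "\r\n"
          . '{"token_type":"bearer","access_token":"AAAA"}';

$size = strpos($response, "\r\n\r\n") + 4;  // what CURLINFO_HEADER_SIZE would report
$json = json_decode(substr($response, $size), true);
echo $json['access_token']; // AAAA
```

This avoids regex guessing and keeps working even if the JSON body happens to contain brace characters inside string values.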
$a='HTTP/1.1 200 OK cache-control: no-cache, no-store, must-revalidate, pre-check=0, post-check=0 content-length: 153 content-type: application/json;charset=utf-8 date: Mon, 10 Nov 2014 10:58:49 UTC expires: Tue, 31 Mar 1981 05:00:00 GMT last-modified: Mon, 10 Nov 2014 10:58:49 GMT ml: A pragma: no-cache server: tsa_b set-cookie: guest_id=v1%3A141561712923128379; Domain=.twitter.com; Path=/; Expires=Wed, 09-Nov-2016 10:58:49 UTC status: 200 OK strict-transport-security: max-age=631138519 x-connection-hash: 57175e4dba3d726bebb399072c225958 x-content-type-options: nosniff x-frame-options: SAMEORIGIN x-transaction: 2e4b8e053e615c75 x-ua-compatible: IE=edge,chrome=1 x-xss-protection: 1; mode=block {"token_type":"bearer","access_token":"AAAAAAAAAAAAAAAAAAAAAMVfbQAAAAAAK7qYRQOgdZ771TrJ6pZ7nugCwVQ%3DLKcongtwy3lcBDbPSEreC9DfhJk3Gm7qyQInqhFAxYvo1clv4S"}"';
preg_match("/\{.*\}/",$a,$m);
$ja=json_decode($m[0]);
var_dump($ja,$m);
output:
object(stdClass)[1]
public 'token_type' => string 'bearer' (length=6)
public 'access_token' => string 'AAAAAAAAAAAAAAAAAAAAAMVfbQAAAAAAK7qYRQOgdZ771TrJ6pZ7nugCwVQ%3DLKcongtwy3lcBDbPSEreC9DfhJk3Gm7qyQInqhFAxYvo1clv4S' (length=112)
I'm just testing some code for image compression and then uploading to the Imgur API. But instead of just getting the response content I seem to be getting the header data as well and I can't figure out how to just parse the JSON. Here's what I have so far:
compress_test.php
include("imgur_upload.php");
echo compress_and_upload();
imgur_upload.php
function compress_and_upload(){
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'https://api.imgur.com/3/image.json');
curl_setopt($ch, CURLOPT_POST, TRUE);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, FALSE); // for localhost
curl_setopt($ch, CURLOPT_HEADER, FALSE); // tried this but no change
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Authorization: Client-ID ' . IMGUR_CLIENT_ID));
curl_setopt($ch, CURLOPT_POSTFIELDS, array('image' => 'http://userserve-ak.last.fm/serve/300x300/51654499.png', 'type' => 'url'));
$reply = curl_exec($ch);
curl_close($ch);
return $reply;
}
What I'm expecting to be echoed is just the JSON data I believe.
Here is what I'm getting:
BHTTP/1.1 200 OK
Server: nginx
Date: Sat, 03 Aug 2013 05:14:53 GMT
Content-Type: application/json
Content-Length: 325
Connection: keep-alive
Set-Cookie: IMGURSESSION=dhjtli4c84koo2jbr8lild7ji7; path=/; domain=.imgur.com
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Pragma: no-cache
Set-Cookie: _nc=1; path=/; domain=.imgur.com; httponly
Access-Control-Allow-Origin: *
Access-Control-Allow-Methods: GET, PUT, POST, DELETE, OPTIONS
Access-Control-Allow-Headers: Authorization, Content-Type, Accept, X-Mashape-Authorization
X-RateLimit-ClientLimit: 12500
X-RateLimit-ClientRemaining: 12473
X-RateLimit-UserLimit: 500
X-RateLimit-UserRemaining: 497
X-RateLimit-UserReset: 1375510414
Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Set-Cookie: UPSERVERID=i-614a2006; path=/
Accept-Ranges: bytes
X-Imgur-Cached: 0
{"data":{"id":"HE981gx","title":null,"description":null,"datetime":1375506893,"type":"image\/png","animated":false,"width":300,"height":300,"size":458117,"views":0,"bandwidth":0,"favorite":false,"nsfw":null,"section":null,"deletehash":"qkl9lNDWCRR52Z0","link":"http:\/\/i.imgur.com\/HE981gx.png"},"success":true,"status":200}°
I ran your code on my own server and didn't have this issue.