I have a script that attempts to pull the lists from Mailchimp. The URL that returns this list is:
https://us8.api.mailchimp.com/1.3/?output=php&method=lists&apikey=XXXXXHIDDENKEYXXXXX-us8
Visiting the link directly displays the serialized array of lists. The issue is when I use PHP to pull this information:
$response = "";
fwrite($sock, $payload);
stream_set_timeout($sock, $this->timeout);
$info = stream_get_meta_data($sock);
while ((!feof($sock)) && (!$info["timed_out"])) {
$response .= fread($sock, $this->chunkSize);
$info = stream_get_meta_data($sock);
}
fclose($sock);
ob_end_clean();
Once I print the $response variable, I get the following output:
HTTP/1.0 302 Moved Temporarily
Server: AkamaiGHost
Content-Length: 0
Location: https://us8.api.mailchimp.com/1.3/?output=php&method=lists&apikey=XXXXXHIDDENKEYXXXXX-us8
Date: Thu, 23 Aug 2018 16:33:55 GMT
Connection: close
Here is a link to the MCAPI library that I am using to attempt the connection:
https://github.com/jbrooksuk/MCAPI-PHP
Why would this information be available when directly visiting the link but blocked via PHP?
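One plausible reading of that output (an inference from the log, not a confirmed diagnosis): the Location header points at exactly the https URL that was requested, which typically happens when the request actually went out as plain HTTP and the front end (here AkamaiGHost) redirects it to the https version; the socket loop above reads the redirect but never follows it. Splitting the raw response into status, headers, and body makes this easy to check:

```php
<?php
// Split a raw HTTP response (as read from a socket) into status code,
// headers, and body -- useful for seeing why $response still contains
// the "HTTP/1.0 302" block instead of the serialized list data.
function parseHttpResponse($raw)
{
    $parts = explode("\r\n\r\n", $raw, 2);
    $headerBlock = $parts[0];
    $body = isset($parts[1]) ? $parts[1] : '';

    $lines = explode("\r\n", $headerBlock);
    $statusLine = array_shift($lines);
    preg_match('#^HTTP/\S+\s+(\d{3})#', $statusLine, $m);
    $status = isset($m[1]) ? (int) $m[1] : 0;

    $headers = array();
    foreach ($lines as $line) {
        $kv = explode(':', $line, 2); // limit 2: URLs in values keep their colons
        if (count($kv) === 2) {
            $headers[strtolower(trim($kv[0]))] = trim($kv[1]);
        }
    }
    return array('status' => $status, 'headers' => $headers, 'body' => $body);
}
```

If the parsed status is a 302 whose Location matches the requested URL, re-issuing the request over an encrypted transport (ssl:// with fsockopen, or the library's secure option if it exposes one) is the thing to try.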
Related
I am a beginner with SOAP web services. I have looked over the related questions and answers, but they have different scenarios and code and don't seem to match my requirement.
I am using a web service created by a third party for product listing. I just want to fetch some data so that I can learn the basic workflow of how web services work. Here is my current code:
require_once('nuSOAP/lib/nusoap.php');

$wsdl = "http://ws.anchordistributors.com/anchorwebservice.asmx?wsdl";
$client = new nusoap_client($wsdl, 'wsdl');

// Input params
$username = "mylogin_id";
$password = "mylogin_password";

// In this demo the parameters are passed as a JSON string
$json = '{"param1":"value1","param2":"value2"}';

$client->setCredentials($username, $password);

$error = $client->getError();
if ($error) {
    echo $error;
    die();
}

$action = "GetCountries"; // webservice method name
$result = array();
if (isset($action)) {
    $result['response'] = $client->call($action, $json);
}

echo "<h3>Output : </h3>";
print_r($result['response']);
echo "<h2>Request</h2>";
echo "<pre>" . htmlspecialchars($client->request, ENT_QUOTES) . "</pre>";
echo "<h2>Response</h2>";
echo "<pre>" . htmlspecialchars($client->response, ENT_QUOTES) . "</pre>";
This is what I am getting back:
Output :
Array ( [faultcode] => soap:Server [faultstring] =>
System.Web.Services.Protocols.SoapException: Server was unable to
process request. ---> System.NullReferenceException: Object reference
not set to an instance of an object. at
AnchorWebservice.AnchorWebservice.AuthenticateUser(String sFunction)
at AnchorWebservice.AnchorWebservice.GetCountries() --- End of inner
exception stack trace --- [detail] => ) Request
POST /anchorwebservice.asmx HTTP/1.0 Host: ws.anchordistributors.com
User-Agent: NuSOAP/0.9.5 (1.123) Content-Type: text/xml;
charset=ISO-8859-1 SOAPAction:
"http://tempuri.org/AnchorWebservice/AnchorWebservice/GetCountries"
Authorization: Basic ODI1NjQ3OkVJTExDODI1 Content-Length: 401
{"param1":"value1","param2":"value2"}
Response
HTTP/1.1 500 Internal Server Error. Connection: close Date: Fri, 06
Oct 2017 18:47:28 GMT Server: Microsoft-IIS/6.0 X-Powered-By: ASP.NET
X-AspNet-Version: 1.1.4322 Cache-Control: private Content-Type:
text/xml; charset=utf-8 Content-Length: 744
soap:Server
System.Web.Services.Protocols.SoapException: Server was unable to process request. ---> System.NullReferenceException:
Object reference not set to an instance of an object. at
AnchorWebservice.AnchorWebservice.AuthenticateUser(String sFunction)
at AnchorWebservice.AnchorWebservice.GetCountries() --- End of
inner exception stack trace ---
I have no idea why this creates an issue or by what logic. Being a beginner, I can't make sense of the error. I just want to be able to use the GetCountries method to fetch the array of countries.
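One hint in the fault is that it is thrown inside AuthenticateUser on the server: the request body above is raw JSON, while NuSOAP's call() expects the parameters as an associative array (or prepared XML), so the server may simply be receiving no usable parameters at all. A sketch of the conversion, with the caveat that param1/param2 are the placeholders from the question, not the real GetCountries signature (check the WSDL for the actual parameter names):

```php
<?php
// NuSOAP serializes an associative array into the SOAP body itself,
// so decode the JSON into an array rather than sending the string.
$json   = '{"param1":"value1","param2":"value2"}';
$params = json_decode($json, true);

// With a real nusoap_client in scope, the call would then look like:
// $result['response'] = $client->call('GetCountries', $params);
```

If GetCountries genuinely takes no arguments, calling it with an empty array (`$client->call('GetCountries', array())`) is worth trying as well.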
I'm trying to receive a response from a website/server, and all I get in return is:
System.ArgumentException','Object of type \'System.DBNull\' cannot be converted to type \'System.String
My PHP code:
$url = 'website';
$fields = array('searchstring' => urlencode('solkrem'), 'menuID' => urlencode(0), 'genses' => urlencode('20170201178577A2F54'), 'removeimages' => urlencode(false));
function httpPost($url, $data)
{
    $curl = curl_init($url);
    curl_setopt($curl, CURLOPT_POST, true);
    curl_setopt($curl, CURLOPT_POSTFIELDS, http_build_query($data));
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
    $response = curl_exec($curl);
    curl_close($curl);
    return $response;
}
echo "<br><br>";
$result = httpPost($url,$fields);
var_dump($result);
I also know that when I try it through requestmaker.com with the same data and URL, I get the response I wanted...
Am I not encoding my fields right, or what else could be the cause?
EDIT: some info from requestmaker.com:
Request Headers Sent:
POST xxxxx HTTP/1.1
Host: xxx.com
Accept: */*
Content-Type: text/html
Content-Length: 75
Response headers received:
HTTP/1.1 200 OK
Date: Thu, 02 Feb 2017 14:03:20 GMT
Server: Microsoft-IIS/6.0
X-Powered-By: ASP.NET
X-AspNet-Version: 4.0.30319
Cache-Control: private
Expires: Thu, 02 Feb 2017 14:03:19 GMT
Content-Type: text/html; charset=utf-8
Content-Length: 58300
EDIT 3:
I found out that even though the site asks me to send the details with the & separator, it won't work like that and produces the same error. But without the & separator, it works. I don't know how it is actually sent, since it's back-end PHP on the test page.
Also, if I don't send any fields at all, I get the same output as the error above.
UPDATE 4:
From their website, I saw they are sending it like:
'searchstring=solkrem\r\nmenuID=0\r\ngenses=20170201178577A2F54\r\nremoveimages=false
Would that make a difference?
I think your error is with the "removeimages" parameter...
You have:
'removeimages' => urlencode(false)
And it should probably be:
'removeimages' => urlencode('false')
URL-encoding a boolean false yields an empty string, so nothing is passed for that parameter in the query string, and a null value is created on the other end.
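This can be verified without the remote service at all; a quick sketch in pure PHP (no network):

```php
<?php
// urlencode() casts its argument to string first, and (string) false is "",
// so the parameter arrives empty on the other end -- which a .NET backend
// may then surface as a System.DBNull-to-String conversion error.
var_dump(urlencode(false));
// string(0) ""

var_dump(http_build_query(array('removeimages' => urlencode(false))));
// string(13) "removeimages="

var_dump(http_build_query(array('removeimages' => 'false')));
// string(18) "removeimages=false"
```

In other words, the server never sees the word "false"; it sees an empty parameter.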
EDIT: the answer is in the comments to the marked answer.
I am currently working with updating a few key components on a mobile web site. The site uses data from a different server to display student schedules. Recently this other site (over which I have zero control) was subject to a major overhaul and naturally I now have to update the mobile web site.
What I am trying to do is access an iCal file and parse it. Since the site I am working on runs in an environment that has neither the cURL library nor properly configured fopen wrappers, I have resorted to the method described here (number 4, using a socket directly).
My current issue is that instead of getting the iCal-file I get a 301 error. However, if I attempt to access the same file (via the same URL) in a web browser it works just fine.
EDIT:
I added a bit of logging and here is what came out of it:
-------------
Querying url:
https://someUrl/schema/ri654Q055ZQZ60QbQ0ygnQ70cWny067Z0109Zx4h0Z7o525Y407Q.ics
Response:
HTTP/1.1 301 Moved Permanently
Server: nginx/1.2.8
Date: Sun, 11 Aug 2013 14:08:36 GMT
Content-Type: text/html
Content-Length: 184
Connection: close
Location:
https://someUrl/schema/ri654Q055ZQZ60QbQ0ygnQ70cWny067Z0109Zx4h0Z7o525Y407Q.ics
<html>
<head><title>301 Moved Permanently</title></head>
<body bgcolor="white">
<center><h1>301 Moved Permanently</h1></center>
<hr><center>nginx/1.2.8</center>
</body>
</html>
Redirect url found: https://someUrl/schema/ri654Q055ZQZ60QbQ0ygnQ70cWny067Z0109Zx4h0Z7o525Y407Q.ics
The new location I am getting is identical to the original one.
This is the code used:
function getRemoteFile($url)
{
    error_log("------------- \r\nQuerying url: " . $url, 3, "error_log.log");

    // get the host name and url path
    $parsedUrl = parse_url($url);
    $host = $parsedUrl['host'];
    if (isset($parsedUrl['path'])) {
        $path = $parsedUrl['path'];
    } else {
        // the url is pointing to the host like http://www.mysite.com
        $path = '/';
    }
    if (isset($parsedUrl['query'])) {
        $path .= '?' . $parsedUrl['query'];
    }
    if (isset($parsedUrl['port'])) {
        $port = $parsedUrl['port'];
    } else {
        // most sites use port 80, but we want port 443 because we are using https
        error_log("Using port 443\r\n" . $url, 3, "error_log.log");
        $port = 443;
    }
    $timeout = 10;
    $response = '';

    // connect to the remote server
    $fp = fsockopen($host, $port, $errno, $errstr, $timeout);
    if (!$fp) {
        echo "Cannot retrieve $url";
    } else {
        $payload = "GET $path HTTP/1.0\r\n" .
            "Host: $host\r\n" .
            "User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.0.3) Gecko/20060426 Firefox/1.5.0.3\r\n" .
            "Accept: */*\r\n" .
            "Accept-Language: sv-SE,sv;q=0.8,en-us,en;q=0.3\r\n" .
            "Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7\r\n" .
            "Referer: https://$host\r\n\r\n";
        error_log("\nPAYLOAD: " . $payload, 3, "error_log.log");

        // send the necessary headers to get the file
        fputs($fp, $payload);

        // retrieve the response from the remote server
        while ($line = stream_socket_recvfrom($fp, 4096)) {
            $response .= $line;
        }
        fclose($fp);

        // naively find location redirect
        $location_pos = strpos($response, "Location:");
        if ($location_pos) {
            $location_pos += 10;
            $new_url = substr($response, $location_pos, strpos($response, "\r\n\r\n") - $location_pos);
            error_log("\nRedirect url found: " . $new_url, 3, "error_log.log");
        } else {
            // log the response
            error_log($response, 3, "error_log.log");
        }

        // strip the headers
        $pos = strpos($response, "\r\n\r\n");
        $response = substr($response, $pos + 4);
    }

    // return the file content
    return $response;
}
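One detail worth checking in the code above (an assumption from reading it, not a confirmed diagnosis): fsockopen() opens a plain TCP connection, so sending a request for an https URL this way transmits unencrypted HTTP, and many servers answer that with a redirect back to the https URL -- which would look exactly like the identical-Location response in the log. An encrypted connection needs the ssl:// transport prefix. A small helper that derives the fsockopen() target from the URL scheme:

```php
<?php
// Choose the fsockopen() target from the URL scheme. A plain socket to
// port 443 sends cleartext; https URLs need the ssl:// transport so that
// PHP negotiates TLS before the request goes out.
function socketTarget($url)
{
    $p = parse_url($url);
    $scheme = isset($p['scheme']) ? $p['scheme'] : 'http';
    if ($scheme === 'https') {
        $host = 'ssl://' . $p['host'];
        $port = isset($p['port']) ? $p['port'] : 443;
    } else {
        $host = $p['host'];
        $port = isset($p['port']) ? $p['port'] : 80;
    }
    return array($host, $port);
}

// Usage inside getRemoteFile() would look like:
// list($host, $port) = socketTarget($url);
// $fp = fsockopen($host, $port, $errno, $errstr, $timeout);
```

This requires the OpenSSL extension to be enabled, which may or may not hold in the restricted environment described in the question.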
HTTP Response Code 301 is a permanent redirect, not an error.
Your code will have to follow that redirect in order to access the resource.
For example, http://google.com/ returns a 301 in order to redirect users to http://www.google.com/ instead.
$ curl -I http://google.com/
HTTP/1.1 301 Moved Permanently
Location: http://www.google.com/
Content-Type: text/html; charset=UTF-8
Date: Sun, 11 Aug 2013 01:25:34 GMT
Expires: Tue, 10 Sep 2013 01:25:34 GMT
Cache-Control: public, max-age=2592000
Server: gws
Content-Length: 219
X-XSS-Protection: 1; mode=block
X-Frame-Options: SAMEORIGIN
Alternate-Protocol: 80:quic
You can see the 301 response on line 2, followed by the Location header which tells the web browser where to go instead.
What likely happened is that during this major overhaul, they moved the resource to another location. In order not to break users' bookmarks or calendars, they used a 301 redirect so that clients automatically fetch the resource from the new location.
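With the socket approach from the question, following the redirect means extracting the Location header and re-issuing the request against it. A minimal sketch of the extraction step (header parsing only, operating on a raw response string already in hand):

```php
<?php
// Extract the Location header from a raw HTTP response.
// Returns null when the response carries no redirect target.
function redirectLocation($rawResponse)
{
    $headerEnd = strpos($rawResponse, "\r\n\r\n");
    $headers = ($headerEnd === false)
        ? $rawResponse
        : substr($rawResponse, 0, $headerEnd);

    if (preg_match('/^Location:\s*(\S+)/mi', $headers, $m)) {
        return $m[1];
    }
    return null;
}
```

The re-request loop around this should cap the number of hops (say, 5) -- otherwise a redirect whose Location equals the requested URL, as in the question, loops forever.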
I'm retrieving data from a URL using cURL.
Everything works fine if the PHP code is called via an HTTP request or if the URL is entered in Firefox. If the very same code is executed from a PHP CLI script, curl_exec returns false. The error message is "Failure when receiving data from the peer".
Any ideas why cURL is not working?
Setting cURL to verbose gives:
< HTTP/1.1 200 OK
< Server: Apache-Coyote/1.1
< Last-Modified: Mon, 01 Aug 2011 13:04:59 GMT
< Cache-Control: no-store
< Cache-Control: no-cache
< Cache-Control: must-revalidate
< Cache-Control: pre-check=0
< Cache-Control: post-check=0
< Cache-Control: max-age=0
< Pragma: no-cache
< Expires: Thu, 01 Jan 1970 00:00:00 GMT
< Content-Type: text/xml
< Transfer-Encoding: chunked
< Date: Mon, 01 Aug 2011 13:04:58 GMT
<
* Trying 153.46.254.70... * Closing connection #0
* Failure when receiving data from the peer
This is the code:
// if curl is not installed we trigger an alert, and exit the function
if (!function_exists('curl_init')) {
    watchdog('sixtk_api', 'curl is not installed, api call cannot be executed', array(), WATCHDOG_ALERT);
    return $this;
}

// OK cool - then let's create a new cURL resource handle
$ch = curl_init();

// Set URL to download
curl_setopt($ch, CURLOPT_URL, $this->request);

// Should cURL return or print out the data? (true = return, false = print)
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

// Timeout in seconds
curl_setopt($ch, CURLOPT_TIMEOUT, 180);

// Download the given URL, and return output
$output = curl_exec($ch);
if (!$output) {
    $error = curl_error($ch);
    echo $error;
}

// Close the cURL resource, and free system resources
curl_close($ch);
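When curl_exec() returns false, curl_errno() and curl_getinfo() narrow things down considerably ("Failure when receiving data from the peer" is CURLE_RECV_ERROR, errno 56). A sketch of the same fetch with fuller diagnostics; the CURLOPT_NOSIGNAL line is an assumption worth testing rather than a confirmed fix, since libcurl's signal-based timeouts are known to behave differently in some CLI environments:

```php
<?php
// Fetch a URL and collect cURL diagnostics alongside the body.
function fetchWithDiagnostics($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 180);
    // Disable signal-based timeout handling; an assumption for this
    // particular CLI-only failure, not a guaranteed cause.
    curl_setopt($ch, CURLOPT_NOSIGNAL, true);

    $output = curl_exec($ch);
    $diag = array(
        'errno' => curl_errno($ch),   // e.g. 56 = CURLE_RECV_ERROR
        'error' => curl_error($ch),
        'info'  => curl_getinfo($ch), // http_code, timings, byte counts...
    );
    curl_close($ch);
    return array($output, $diag);
}
```

Comparing the diagnostics from the CLI run against the web-server run (same URL, same machine) should show whether the difference is in DNS resolution, proxy settings, or the transfer itself.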
Try wget. If that fails too, but you can access the address from another IP/device, this probably means your IP is being blocked or filtered out by a firewall or nginx anti-DDoS protection. Try a proxy.
I am attempting to load each URL in a sitemap.xml file in an effort to pre-cache them and speed up the user's experience.
I have the following code, which grabs the URLs from the sitemap:
$ch = curl_init();
/**
 * Set the URL of the page or file to download.
 */
curl_setopt($ch, CURLOPT_URL, 'http://onlineservices.letterpart.com/sitemap.xml;jsessionid=1j1agloz5ke7l?id=1j1agloz5ke7l');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$data = curl_exec($ch);
curl_close($ch);

$xml = new SimpleXMLElement($data);
foreach ($xml->url as $url_list) {
    $url = $url_list->loc;
    echo $url . "<br>";
}
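As an aside about the loop above: $url_list->loc is a SimpleXMLElement object, not a string. echo coerces it implicitly, which is why the listing looks fine, but functions that inspect the value's type (or a later fsockopen/parse_url call) may not behave as expected, so an explicit (string) cast is safer. A minimal sketch with an inline XML fixture (example.com is a placeholder, and the fixture omits the sitemap namespace for brevity):

```php
<?php
// $url_list->loc is a SimpleXMLElement; cast it before treating it as a string.
$xml = new SimpleXMLElement(
    '<urlset><url><loc>http://example.com/page1</loc></url></urlset>'
);

$urls = array();
foreach ($xml->url as $url_list) {
    $urls[] = (string) $url_list->loc;
}
```

After the cast, each entry in $urls is a plain string that can be handed to any URL-handling function.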
I am now trying to use fsockopen to load each URL in turn, where $url is in this format: http://onlineservices.letterpart.com:80/content/en/FAMILY-201103311115/Family_FLJONLINE_FLJ_2009_07_4
foreach ($xml->url as $url_list) {
    $url = $url_list->loc;
    $fp = fsockopen($url, 80);
    if ($fp) {
        fwrite($fp, "GET / HTTP/1.1\r\nHOST: $url\r\n\r\n");
        while (!feof($fp)) {
            print fread($fp, 256);
        }
        fclose($fp);
    } else {
        print "Fatal error\n";
    }
}
But this is giving me this error for each URL:
[12-May-2011 13:34:09] PHP Warning: fsockopen() [function.fsockopen]: unable to connect to http://onlineservices.letterpart.com:80/content/en/FAMILY-201103311115/Family_FLJONLINE_FLJ_2009_07_4:-1 (Unable to find the socket transport "http" - did you forget to enable it when you configured PHP?) in /home/digital1/public_html/dev/sitemap.php on line 32
I have read that I need to pass "just the hostname, not the URL in the fsockopen call. You'll need to provide the uri, minus the host/port in the actual HTTP headers",
so I tried this:
$fp = fsockopen("http://onlineservices.letterpart.com", 80);
if ($fp) {
    fwrite($fp, "GET / HTTP/1.1\r\nHOST: content/en/FAMILY-201103311115/Family_FLJONLINE_FLJ_2009_07_4\r\n\r\n");
    while (!feof($fp)) {
        print fread($fp, 256);
    }
    fclose($fp);
} else {
    print "Fatal error\n";
}
But I still get the same error.
EDIT:
If I change the fsockopen call to:
$fp = fsockopen ("onlineservices.letterpart.com",80);
then I get a slightly different, better, but still wrong response. It seems to be ignoring the onlineservices.letterpart.com section and trying http:///content/. BUT... it has appended /web/ui.xql?action=html&resource=login.html to the end of the URL, which is our login page, so it must be reaching our server...
HTTP/1.1 302 Moved Temporarily Date: Thu, 12 May 2011 14:40:02 GMT Server: Jetty/5.1.12 (Windows 2003/5.2 x86 java/1.6.0_07 Expires: Thu, 01 Jan 1970 00:00:00 GMT Set-Cookie: JSESSIONID=nh62zih3q8mf;Path=/ Location: http:///content/en/FAMILY-201103311115/Family_FLJONLINE_FLJ_2009_07_4/web/ui.xql?action=html&resource=login.html Content-Length: 0
Thanks.
fsockopen is not intended for HTTP requests;
cURL is a better choice (and much more powerful).
There is also file_get_contents, which makes it quick:
foreach ($xml->url as $url_list) {
    $url = $url_list->loc;
    file_get_contents($url);
}
Useful for application cache warm-up!
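If fsockopen() has to be used anyway, note that the call takes the bare hostname while the full path belongs in the request line, and the Host header carries the hostname -- the opposite of the second attempt in the question. A sketch of building that request from a URL (the letterpart.com URL is taken from the question purely as an example):

```php
<?php
// Build a minimal HTTP/1.1 request for a URL: the hostname goes in the
// Host header, the path (plus any query string) goes in the request line.
function buildHttpRequest($url)
{
    $p = parse_url($url);
    $path = isset($p['path']) ? $p['path'] : '/';
    if (isset($p['query'])) {
        $path .= '?' . $p['query'];
    }
    return "GET $path HTTP/1.1\r\n"
         . "Host: {$p['host']}\r\n"
         . "Connection: close\r\n\r\n";
}

// Usage with a socket would then look like:
// $fp = fsockopen('onlineservices.letterpart.com', 80);
// fwrite($fp, buildHttpRequest($url));
```

The Connection: close header matters with HTTP/1.1: without it the server may hold the connection open and the feof() loop will hang until the server times out.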