MagpieRSS: Failed to fetch error - php

When my script parses an RSS feed on my local Apache server, it parses fine, but when I upload the script to the remote hosting server it gives the error:
Warning: MagpieRSS: Failed to fetch (url) and cache is off in magpierss-0.6/rss_fetch.inc on line 231.
I have searched for possible answers, and the suggestions all revolve around enabling the cache or changing the cache lifetime. I don't think that is the problem; it looks like the problem is with the remote host's Apache server, or with access being denied to my host.
Can anyone help?

Check that Magpie can actually open the RSS URL. That usually means having cURL enabled or allow_url_fopen switched on on your server. Also make sure that Magpie is permitted to create its cache files (if caching is enabled).
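A quick diagnostic sketch for both points, to run on the remote host (the cache path assumes MagpieRSS's default of ./cache; adjust if you define MAGPIE_CACHE_DIR):
<?php
// Check the two common prerequisites for fetching remote feeds,
// plus whether Magpie's cache directory is writable.
echo 'allow_url_fopen: ' . (ini_get('allow_url_fopen') ? 'On' : 'Off') . "\n";
echo 'cURL extension: ' . (function_exists('curl_init') ? 'loaded' : 'missing') . "\n";
echo 'cache writable: ' . (is_writable('./cache') ? 'yes' : 'no') . "\n";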

This could be caused by a bad URL, by the feed's server being busy, or by remote file access not being allowed on your host.
The line $errormsg = "Failed to fetch $url "; in the MagpieRSS source indicates a problem reaching the remote host.
Try checking access from the remote host - if you have SSH access and curl installed, just run:
curl (url)
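If you don't have shell access, a rough equivalent from PHP, assuming the cURL extension is available (the URL is a placeholder for your feed):
<?php
// Fetch the feed directly, bypassing Magpie, to see whether the host
// can reach the feed at all and what status code comes back.
$ch = curl_init('http://example.com/feed.rss');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$body = curl_exec($ch);
if ($body === false) {
    echo 'cURL error: ' . curl_error($ch) . "\n";
} else {
    echo 'HTTP ' . curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\n";
}
curl_close($ch);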

Related

Unable to communicate with *.roblox.com from PHP Google App Engine

I've done extensive testing: I enabled verbose cURL logging (it leaves no logs and gives a generic cURL error #7) and tried using the built-in handlers through file_get_contents (also errored, see below). It seems that no matter what, if I attempt to request information from anything on the roblox.com domain from my app, it errors before it can even try. I know it is not the distant end, as multiple other sites work fine, and I've used an alternate host to try the same communications I'm doing with Google App Engine and it worked without any issue. At this point I can only conclude that Google has banned my app from communicating with the ROBLOX website without giving me any indication of any kind. If this is true, why is my app banned, and more importantly, why wasn't I alerted?
cURL output with verbose logging enabled:
https://api.roblox.com/users/get-by-username?username=christbru01
CURL Failed with error #7:
CURL HTTP CODE #0
CURL INFO: 0
This is the code which generated these:
// $s is the cURL handle and $newurl the requested URL (setup omitted above)
syslog(LOG_DEBUG, (string)$newurl);
syslog(LOG_WARNING, 'CURL Failed with error #'.curl_errno($s).": ".curl_error($s));
syslog(LOG_DEBUG, 'CURL HTTP CODE #'.curl_getinfo($s, CURLINFO_HTTP_CODE));
syslog(LOG_DEBUG, 'CURL INFO: '.curl_getinfo($s, CURLINFO_HTTP_CONNECTCODE));
file_get_contents output:
file_get_contents(https://api.roblox.com/users/get-by-username?username=Christbru01): failed to open stream: Connection error
This is the code which generated this:
echo file_get_contents("link removed due to insufficient reputation");
You need to enable cURL in your instance by adding google_app_engine.enable_curl_lite = "1" to your php.ini file.
https://cloud.google.com/appengine/docs/php/config/php_ini
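For reference, that php.ini entry (as described in the documentation linked above) looks like this:
; php.ini in the root of your App Engine project
google_app_engine.enable_curl_lite = "1"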

file_get_contents not working on different servers

I am trying to simply use file_get_contents() to get the content of http://www.google.com/search?hl=en&tbm=nws&authuser=0&q=Pakistan with the same code on 2 different servers. One fetches everything fine while the other gets a 403 error. I am unable to figure out what exactly the reason is. I used phpinfo() on both servers.
One difference I observe is that one uses Apache 2 while the other uses a different HTTP server, LiteSpeed V6.6. But I don't know how that would affect file_get_contents(). For more detail you can see their phpinfo() pages linked below.
Where file_get_contents() gets the 403, the phpinfo is: http://zavahost.com/newsreader/phpinfo.php
while where it is working fine, here is the phpinfo: http://162.243.5.14/info.php
I would be thankful if someone could tell me what is affecting file_get_contents(). Please let me know if you have any idea.
403 is a Forbidden error: the server understood the request but refuses to fulfill it. It could be that your hosting provider's outgoing requests are being rejected, or that the remote server is denying the request based on header information (such as the User-Agent) that it has flagged as unwelcome.
Try using the answer on this post: php curl: how can i emulate a get request exactly like a web browser? to curl the same data from the server that is getting the 403.
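A minimal sketch of that approach (the User-Agent string and extra headers are illustrative; the linked answer goes further):
<?php
// Send browser-like headers so the remote server treats the request
// as if it came from a real browser.
$ch = curl_init('http://www.google.com/search?hl=en&tbm=nws&authuser=0&q=Pakistan');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_USERAGENT,
    'Mozilla/5.0 (Windows NT 6.1; rv:25.0) Gecko/20100101 Firefox/25.0');
curl_setopt($ch, CURLOPT_HTTPHEADER, array(
    'Accept: text/html,application/xhtml+xml',
    'Accept-Language: en-US,en;q=0.5',
));
$html = curl_exec($ch);
echo curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);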

PHP file_get_contents sudden failure on server

I was using file_get_contents() to fetch information from an external URL. It was working perfectly on the server before, but now it suddenly fails on the server with no changes to the code. It keeps giving me the error: failed to open stream: Connection timed out.
I have tested it on localhost and it works perfectly. I have checked the allow_url_fopen option; it is still On.
So, what could be the reason(s)?
file_get_contents() is a poor fit for fetching remote files: it gives you little control over slow network connections and redirects, and it reports failures as warnings instead of returning error codes. You should use cURL instead to fetch remote files.
There is an example in the manual for curl_exec:
http://us3.php.net/manual/en/function.curl-exec.php
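Adapted to this case, a sketch with explicit timeouts (the URL is a placeholder for the external URL in question):
<?php
$ch = curl_init('http://example.com/data');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5); // fail fast instead of hanging
curl_setopt($ch, CURLOPT_TIMEOUT, 15);       // cap the whole transfer
$data = curl_exec($ch);
if ($data === false) {
    // Unlike file_get_contents(), you get a concrete error to act on.
    echo 'Error: ' . curl_error($ch);
}
curl_close($ch);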

Why would DOMDocument::load work on a dev site but give an error on production?

I am stumped. The code is exactly the same and both are hosted by the same Rackspace setup... yet one works fine and the other generates an error:
[domdocument.load]: failed to open stream: HTTP request failed!
The code is fairly simple:
$doc = new DOMDocument();
$page_url = '...'; // a valid XML feed URL (elided in the original post)
$doc->load($page_url);
So I do not understand what is causing the error . . . Any ideas on what to check?
Also: everything was working fine until yesterday, so it must have been something in one of the (unfortunately very numerous) patches implemented yesterday. I just need an idea of where to start looking.
Some investigating reveals it is the feed's problem (Indeed's job API). I switched to a different job feed provider and everything is working. I'm still confused why it would work in dev but not production (unless Indeed blocked us for some reason).
URL wrappers on your system might be disabled. Can you check the value of the PHP ini setting allow_url_fopen? If it's disabled, you won't be allowed to fetch files from URLs.
echo ini_get('allow_url_fopen');
Maybe:
http://www.php.net/manual/en/domdocument.load.php#91384
Jonas Due Vesterheden 09-Jun-2009 03:18
I had a problem with loading documents over HTTP. I would get errors looking like this:
Warning: DOMDocument::load(http://external/document.xml): failed to open stream: HTTP request failed! HTTP/1.1 500 Internal Server Error
The document would load fine in browsers and using wget. The problem is that DOMDocument::load() on my systems (both OS X and Linux) didn't send any User-Agent header which for some weird reason made Microsoft-IIS/6.0 respond with the 500 error.
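If a missing User-Agent header is the cause here too, one workaround (a sketch; the User-Agent string is arbitrary) is to hand libxml a stream context that sends one:
<?php
// Make DOMDocument::load() send a User-Agent header by attaching
// a stream context to libxml's HTTP requests.
libxml_set_streams_context(stream_context_create(array(
    'http' => array(
        'user_agent' => 'Mozilla/5.0 (compatible; MyFeedReader/1.0)',
    ),
)));
$doc = new DOMDocument();
$doc->load($page_url); // $page_url as in the question above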
If you're using a remote feed (not one on a machine local to each server), then chances are it's something related to allow_url_fopen. Loading remote files as if they were local resources is usually disabled for security reasons.
OK, I found the problem: our IP address was being blocked by the API provider... thanks for the advice anyway.

PHP file_get_contents not working on real server, but does work on localhost?

Hey guys,
I developed a website on my local Apache setup on my Mac. I'm making two requests to external domains. One goes out to geoplugin.net to get the current geolocation.
This works just fine on my local setup. However, when I transfer the files to my real server, the website prints the following:
Warning: file_get_contents(http://www.geoplugin.net/php.gp?ip=185.43.32.341) [function.file-get-contents]: failed to open stream: HTTP request failed! HTTP/1.0 403 Forbidden in /home/.sites/74/site484/web/testsite/wp-content/themes/test/header.php on line 241
What can I do here? What am I doing wrong?
Furthermore, I'm using a cURL request on my website which doesn't retrieve data either. Both work fine on my local MAMP setup.
Any ideas?
The server responds with a "403 Forbidden" status code. So file_get_contents() works fine, but the server you are trying to access (or a proxy or something in between) doesn't allow it.
This can have many reasons. For example (as the comment on the question suggests), you are banned or blocked (because of too many requests), or something similar.
HTTP/1.0 403 Forbidden
means you are not allowed to access these files! Try adding a User-Agent header.
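A sketch of that with a stream context (the User-Agent string is arbitrary; $ip stands in for the visitor's address):
<?php
// Send a User-Agent header with file_get_contents() via a stream context.
$ip = $_SERVER['REMOTE_ADDR']; // the visitor's IP
$context = stream_context_create(array(
    'http' => array(
        'user_agent' => 'Mozilla/5.0 (compatible; MySite/1.0)',
    ),
));
$result = file_get_contents(
    'http://www.geoplugin.net/php.gp?ip=' . urlencode($ip), false, $context);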
You need to create an account at geoplugin.com and subscribe your domain to use the web service without limitation; then you will stop receiving the 403 Forbidden error. Don't worry about costs: it's a free service. I'm using it on three sites.
Try urlencoding the query string. I would also recommend using the cURL extension.
That is because geoPlugin is limited to 120 lookups per minute:
http://www.geoplugin.com/premium
So any website feature based on this solution can suddenly break.
I would recommend using both www.geoplugin.net/json.gp?ip={ip} and freegeoip.net/json/{ip}: check whether the first one returns null (meaning the limit has been reached), and if so fall back to the other.
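A rough sketch of that fallback (both services return JSON, but their field names differ, so treat the keys as illustrative):
<?php
// Try geoPlugin first; fall back to freegeoip if the lookup fails
// or the 120-per-minute limit has been reached.
function geolocate($ip) {
    $data = json_decode(@file_get_contents(
        'http://www.geoplugin.net/json.gp?ip=' . urlencode($ip)), true);
    if (!empty($data['geoplugin_countryName'])) {
        return $data['geoplugin_countryName'];
    }
    // Fallback service (note the different JSON schema).
    $data = json_decode(@file_get_contents(
        'http://freegeoip.net/json/' . urlencode($ip)), true);
    return isset($data['country_name']) ? $data['country_name'] : null;
}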
