I added the plugin from geoplugin.com to my page and activated my domain, but my error log constantly shows errors like this:
[20-Mar-2013 11:17:13 CET] PHP Warning: file_get_contents(http://www.geoplugin.net/php.gp?ip=157.55.34.183&base_currency=USD) [<a href='function.file-get-contents'>function.file-get-contents</a>]: failed to open stream: HTTP request failed! HTTP/1.1 403 Forbidden
in /wwwroot.wwwnew/templates/geo/geoplugin.class.php on line 105
How can I solve this problem?
If geoplugin.net was responding perfectly, then stopped, then you have
gone over the free lookup limit of 120 requests per minute.
We automatically block all requests coming from an IP or a domain name
if the number of requests exceeds 120 lookups per minute. This is
explained in our Acceptable Use Policy. This block comes in the form
of an HTTP/1.1 403 Forbidden reply from geoplugin.net.
The block is automatically removed one hour after your server stops sending more than 120 requests a minute.
- geoPlugin FAQ
If you need more than 120 requests per minute, you have to whitelist your domain, which is a paid service.
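Short of paying for whitelisting, caching lookups on your side is the usual way to stay under the limit. Here is a minimal sketch, assuming you query the php.gp endpoint directly (it returns a PHP-serialized array); the cache location and one-day TTL are arbitrary choices:

<?php
// Cache geoPlugin lookups per IP so repeat visitors don't count
// against the 120-requests-per-minute limit.
function geoplugin_lookup($ip) {
    $cacheFile = sys_get_temp_dir() . '/geo_' . md5($ip) . '.ser';

    // Serve from cache if it is less than a day old.
    if (is_file($cacheFile) && time() - filemtime($cacheFile) < 86400) {
        return unserialize(file_get_contents($cacheFile));
    }

    // Short timeout so a 403/block doesn't hang the page.
    $ctx = stream_context_create(array('http' => array('timeout' => 3)));
    $raw = @file_get_contents(
        'http://www.geoplugin.net/php.gp?ip=' . urlencode($ip), false, $ctx
    );
    if ($raw === false) {
        return null; // blocked or timed out -- degrade gracefully
    }

    file_put_contents($cacheFile, $raw);
    return unserialize($raw);
}
?>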
Related
Background: I have 250 GB of object storage at Dreamhost and use the AWS S3 client (PHP) to upload files to it. It worked fine for months until they migrated their servers from the West Coast to the East Coast. The only change (very small and simple) to my scripts was a new host URL/region. My bucket holds around 1 million photos/thumbnails, averaging 10 KB-100 KB each.
For the two months since then, some photos upload fine, but about half the time an upload results in 400/500 errors. We contacted Dreamhost support and they have been absolutely stumped for two months, with no answer to the problem. Here are the kinds of errors in our logs:
[05-Dec-2018 12:28:27 UTC] PHP Fatal error: Uncaught exception 'Aws\S3\Exception\S3Exception' with message 'Error executing "PutObject" on "https://objects-us-east-1.dream.io/mybucket/img.jpg"; AWS HTTP error: Client error: `PUT https://objects-us-east-1.dream.io/mybucket/img.jpg` resulted in a `408 Request Time-out` response:
<html><body><h1>408 Request Time-out</h1>
Your browser didn't send a complete request in time.
</body></html>
(client): - <html><body><h1>408 Request Time-out</h1>
Your browser didn't send a complete request in time.
</body></html>
'
GuzzleHttp\Exception\ClientException: Client error: `PUT https://objects-us-east-1.dream.io/mybucket/img.jpg` resulted in a `408 Request Time-out` response:
<html><body><h1>408 Request Time-out</h1>
Your browser didn't send a complete request in time.
</body></html>
in /home/username/mysite.com/includes/cdn/aws/GuzzleHttp/Exception/RequestException.php:113
in /home/username/mysite.com/includes/cdn/aws/Aws/WrappedHttpHandler.php on line 191
[05-Dec-2018 12:44:21 UTC] PHP Fatal error: Uncaught exception 'Aws\S3\Exception\S3Exception' with message 'Error executing "PutObject" on "https://objects-us-east-1.dream.io/mybucket/img.jpg"; AWS HTTP error: cURL error 28: Operation timed out after 0 milliseconds with 0 out of 0 bytes received (see http://curl.haxx.se/libcurl/c/libcurl-errors.html)'
GuzzleHttp\Exception\ConnectException: cURL error 28: Operation timed out after 0 milliseconds with 0 out of 0 bytes received (see http://curl.haxx.se/libcurl/c/libcurl-errors.html) in /home/username/mysite.com/includes/cdn/aws/GuzzleHttp/Handler/CurlFactory.php:186
Stack trace:
#0 /home/username/mysite.com/includes/cdn/aws/GuzzleHttp/Handler/CurlFactory.php(150): GuzzleHttp\Handler\CurlFactory::createRejection(Object(GuzzleHttp\Handler\EasyHandle), Array)
#1 /home/username/mysite.com/includes/cdn/aws/GuzzleHttp/Handler/CurlFactory.php(103): GuzzleHttp\Handler\CurlFactory::finishError(Object(GuzzleHttp\Handler\CurlMultiHandler), Object(GuzzleHttp\H in /home/username/mysite.com/includes/cdn/aws/Aws/WrappedHttpHandler.php on line 191
In an attempt to narrow down the problem, I've also run the simplest of examples, like listing buckets (Dreamhost's tutorial examples), and the same behavior happens, even on a new test bucket with one image in it. If I refresh the browser once every few seconds, it might list the buckets 2-3 times successfully, but on the 4th refresh the page "hangs" for a long time; it might finally display the bucket after a 150-second delay, or the script might just time out. Dreamhost noticed the same thing when they set up an example on a basic cloud server instance: the bucket list might load immediately, or after 60 seconds, 120 seconds, 180 seconds, and so on. A clue: it seems to load just after 30-second increments (those times of 180, 150, 120, and 60 are all divisible by 30).
I'm hoping someone understands what is happening here. The problem is so bad that we have hundreds of unhappy merchants in our marketplace who are having a hard time listing new products for sale, because this image-uploading issue makes it nearly impossible for them to upload images and causes their browsers to "hang". To make matters worse, these upload timeouts tie up all 40 of my PHP processes, which indirectly causes 500 Internal Server Errors for site visitors as well. Our site doesn't have that much traffic, maybe 10,000 visitors per day. Again, it is surprising that Dreamhost has been stumped for months; they say I'm the only customer they have with this issue.
Other info, my server is running on:
Ubuntu 16.04
Apache 2.4.33
PHP-FPM (7.0)
cURL 7.47.0
AWS S3 SDK for PHP 3.81.0
Have HTTPS and HTTP/2 enabled
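This doesn't explain the root cause, but bounding the SDK's timeouts at least makes a stalled request fail in seconds and retry, instead of pinning a PHP-FPM worker for minutes. A minimal sketch using the SDK's documented retry and timeout options (the autoloader path and credentials are placeholders; adjust to your layout):

<?php
use Aws\S3\S3Client;

require 'includes/cdn/aws/aws-autoloader.php'; // adjust path to your layout

$s3 = new S3Client([
    'version'     => 'latest',
    'region'      => 'us-east-1',
    'endpoint'    => 'https://objects-us-east-1.dream.io',
    'credentials' => ['key' => 'YOUR_KEY', 'secret' => 'YOUR_SECRET'],
    'retries'     => 3,              // retry transient failures
    'http'        => [
        'connect_timeout' => 5,      // seconds to open the connection
        'timeout'         => 30,     // seconds for the whole request
    ],
]);

Since the hangs come in 30-second multiples, it may also be worth ruling out connection-level fallback delays (for example, IPv6 vs. IPv4 resolution) between your server and the new East Coast endpoint.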
My PHP script runs okay when my traffic is 5,000 visitors per day - no errors.
When I suddenly increase my traffic to 3,000 visitors per hour, I get many errors like this:
[26-May-2017 07:30:03 Asia/Jakarta] PHP Warning: file_get_contents(http://mydomain1.com/api/ip2country_v6/?ip=76.119.xxx.xxx): failed to open stream: HTTP request failed! in /home/h32xxx/mydomain2.com/landings/script.php on line 47
This PHP script is on mydomain2.com.
file_get_contents() is requesting from mydomain1.com.
mydomain2.com and mydomain1.com are on one server account. I can't use 'http://localhost/~h32xxx/api/ip2country_v6/?ip=76.119.xxx.xxx' because my cPanel settings block it (mod_userdir is disabled).
What is the problem and how do I fix it?
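One thing that will help regardless of the root cause: every landing-page hit makes a second HTTP request back into the same server, so during a traffic spike each visitor ties up two PHP processes, and the inner request starts failing once workers run out. A short timeout with a fallback keeps the page from hanging while that happens (the fallback value below is a hypothetical placeholder):

<?php
$url = 'http://mydomain1.com/api/ip2country_v6/?ip=' . urlencode($ip);

// Fail fast instead of waiting out the default socket timeout.
$ctx = stream_context_create(array('http' => array('timeout' => 2)));
$country = @file_get_contents($url, false, $ctx);

if ($country === false) {
    $country = 'XX'; // placeholder fallback when the API is unreachable
}
?>

Since both domains live on one account, an even cheaper option is to require the lookup code directly and skip the loopback HTTP round trip entirely.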
I am fetching data from ClearDB MySQL. It takes around 10 minutes to return a result.
But after 230 seconds Azure gives the error: "500 - The request timed out.
The web server failed to respond within the specified time."
I have tried setting max_execution_time to infinite and changed other config variables in .user.ini.
I also tried setting set_time_limit(0); and ini_set('max_execution_time', 6000000); in the first line of the PHP script file.
But no luck.
I don't want to use webjobs.
Is there any way to resolve the Azure "500 - The request timed out" issue?
That won't work: the 230-second cutoff you're seeing is Azure's in-flight request timeout, enforced by the front end before your PHP settings ever come into play, so you'll hit it long before the 10-minute wait ends.
Here's a better approach: call a stored procedure that produces the result, and make a second call 10 minutes later to retrieve the data (see the sketch after this list):
Call the stored procedure from your code.
Return a Location: header in the response.
Follow that URL to grab the results: 200 OK means you have them, 417 Expectation Failed means not yet.
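A minimal sketch of the retrieval side, assuming a hypothetical report_results table that the stored procedure fills in when it finishes (adjust names and credentials to your schema). The launch side must likewise return immediately, for example by queuing the job for a worker or a MySQL event rather than waiting on the query:

<?php
// status.php -- poll-for-results endpoint: 200 with the data when the
// stored procedure has finished, 417 when it hasn't yet.
$pdo = new PDO('mysql:host=HOST;dbname=DB', 'USER', 'PASS', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

$jobId = isset($_GET['job']) ? $_GET['job'] : '';
$stmt = $pdo->prepare('SELECT payload FROM report_results WHERE job_id = ?');
$stmt->execute([$jobId]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);

if ($row === false) {
    http_response_code(417); // not ready -- client should poll again later
    exit;
}

header('Content-Type: application/json');
echo $row['payload']; // 200 OK with the finished result

Each poll returns in milliseconds, so no single request ever approaches the 230-second limit.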
This page of my site takes a minute to load, and then I get this error:
"Warning:
simplexml_load_file(http://www.cpalead.com/dashboard/reports/campaign_rss.php?id=MyID&geoip=MyIP&show=6&offer_type=pinsubmit):
failed to open stream: Connection timed out in ../pass/pass.php on
line 125 Warning: simplexml_load_file()"
My code:
$call_url = 'http://www.cpalead.com/dashboard/reports/campaign_rss.php?id='.$user_id.'&geoip='.$ip.'&show=6&offer_type=pinsubmit';
if ($xml = simplexml_load_file($call_url, "SimpleXMLElement", LIBXML_NOCDATA)) {
    // ... process the feed ...
}
What I've tried to fix it, without success:
Switched allow_url_fopen to On in my hosting's php.ini.
Set the max timeout to 300 seconds instead of 120.
It's because the website has blocked your server after receiving too many requests from it.
There is no solution until they unblock you, either voluntarily or upon your request (I am not sure they have a way to do that). But if you can move to another server, that can work, until they block that server too!
The golden rule is: don't make too many requests to any specific site unless their terms allow it.
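Once you're unblocked, caching the feed locally keeps the request rate down and stops the page from hanging when cpalead.com is slow. A minimal sketch using the $call_url from your code (the cache path, 10-minute TTL, and 5-second timeout are arbitrary):

<?php
$cacheFile = __DIR__ . '/cache/campaign_rss.xml';
$ttl = 600; // refetch at most every 10 minutes

if (!is_file($cacheFile) || time() - filemtime($cacheFile) > $ttl) {
    // Short timeout so a slow remote can't stall the page for a minute.
    $ctx = stream_context_create(array('http' => array('timeout' => 5)));
    $fresh = @file_get_contents($call_url, false, $ctx);
    if ($fresh !== false) {
        file_put_contents($cacheFile, $fresh);
    }
}

if (is_file($cacheFile)) {
    $xml = simplexml_load_file($cacheFile, 'SimpleXMLElement', LIBXML_NOCDATA);
}
?>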
I am trying to get postal codes from this site:
http://pl.wikisource.org/wiki/Lista_kod%C3%B3w_pocztowych_w_Polsce
My code is simple:
<?php
$postalCode = $_GET['code'];
$httpAddr = 'http://pl.wikisource.org/wiki/Lista_kod%C3%B3w_pocztowych_w_Polsce/Okr%C4%99g_'.$postalCode[0].'_'.$postalCode[0].$postalCode[1].'-xxx';
file_get_contents($httpAddr);
?>
But when I set $postalCode to 03-000 (also 01-000 and 05-000, though 07-000, 61-000, and 62-000 work), I receive this error:
Warning: file_get_contents(http://pl.wikisource.org/wiki/Lista_kod%C3%B3w_pocztowych_w_Polsce/Okr%C4%99g_0_03-xxx): failed to open stream: HTTP request failed! HTTP/1.0 403 Forbidden in /var/www/clients/client1/web4/web/ofix/test.php on line 5
The page address is correct; you can copy and paste it into your web browser and it works.
Any ideas?
As Lightness Races in Orbit suspected, it does seem that the webserver is blocking PHP's request.
Using cURL instead of file_get_contents() reveals the details:
HTTP/1.0 403 Forbidden
Scripts should use an informative User-Agent string with contact information, or they may be IP-blocked without notice.
A web browser sends a valid User-Agent header in its request, which is why the page loads OK in your browser but not in PHP.
In my tests loading this URL in PHP, it sometimes succeeds with an HTTP status code of 200 and other times fails with 403. Notice that the error message says scripts may be blocked (i.e., sometimes they may not be).
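The fix is to send an informative User-Agent yourself. A minimal sketch using a stream context with file_get_contents() and the $httpAddr from your code (the UA string is a placeholder; per the message above, include real contact information):

<?php
$ctx = stream_context_create(array(
    'http' => array(
        'header' => "User-Agent: MyPostalCodeScript/1.0 (contact: you@example.com)\r\n",
    ),
));
$html = file_get_contents($httpAddr, false, $ctx);
?>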
Edit
See this question for more info: How to get results from the Wikipedia API with PHP?