I'm trying to do a pretty basic FTP directory read on Heroku.
Code:
$ftp = ftp_connect($config['domain']);
$login = ftp_login($ftp, $config['ftp_user'], $config['ftp_pass']);
ftp_pasv($ftp, true);

$contents = ftp_nlist($ftp, "/");
for ($i = 0; $i < count($contents); $i++) {
    echo "<li>" . substr($contents[$i], 1) . "</li>";
}

ftp_close($ftp);
Heroku Error
2015-05-19T07:26:01.678102+00:00 heroku[router]: at=error code=H12 desc="Request timeout" method=GET path="/" host=xxx-ftp.herokuapp.com request_id=xxxxx-364a-48f8-8e2a-383affb0789f fwd="xx.12.8.106" dyno=web.1 connect=0ms service=30000ms status=503 bytes=0
2015-05-19T07:26:31.739770+00:00 app[web.1]: [Tue May 19 07:26:31.738789 2015] [proxy_fcgi:error] [pid 186:tid 140199224919808] (70007)The timeout specified has expired: [client 10.127.183.84:42907] AH01075: Error dispatching request to : (polling)
2015-05-19T07:26:31.853475+00:00 app[web.1]: [19-May-2015 07:26:31 UTC] PHP Warning: ftp_nlist(): php_connect_nonb() failed: Operation now in progress (115) in /app/list.php on line 26
Facts
The connection is OK (returns true)
Heroku only times out when ftp_nlist or ftp_rawlist is executed
The login is OK
I've tried with and without passive mode
Other basic commands such as PWD work fine
I've also tried listing a more specific folder, e.g. ftp_nlist($conn, "/MyFolder");
The number of folders I'm trying to read and list is small
The FTP server responds in under 2 seconds from other FTP clients
The same code works fine on another server (not Heroku)
This seems to work just fine for ftp.mozilla.org for instance:
$ftp = ftp_connect('ftp.mozilla.org');
$login_result = ftp_login($ftp, 'anonymous', '');
ftp_pasv($ftp, true);
var_dump(ftp_nlist($ftp, "/pub/mozilla.org/"));
Are you maybe hitting https://bugs.php.net/bug.php?id=55651 with your destination FTP behind NAT?
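If it is that bug (a server behind NAT advertising its private IP in the PASV reply), later PHP versions let you ignore the advertised address. A minimal sketch, assuming PHP 7.3+ where the FTP_USEPASVADDRESS option exists; the host and credentials are placeholders:

```php
<?php
// Work around a NAT-confused PASV reply (requires PHP 7.3+).
// 'ftp.example.com' and the credentials are placeholders.
$ftp = ftp_connect('ftp.example.com', 21, 10); // 10 s connect timeout

if ($ftp && ftp_login($ftp, 'user', 'pass')) {
    // Ignore the IP the server advertises in its PASV response and
    // reuse the control connection's address instead.
    ftp_set_option($ftp, FTP_USEPASVADDRESS, false);
    ftp_pasv($ftp, true);

    var_dump(ftp_nlist($ftp, '/'));
    ftp_close($ftp);
}
```

On older PHP versions there is no equivalent switch for the ftp_* functions, which is why the bug report suggests other workarounds.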
Related
I am having an odd experience with Heroku hosting my Laravel app/API.
Locally everything looks good, but recently requests are slow or timing out.
I still get results (most of the time), but the logs show the following:
2021-04-07T14:07:08.267681+00:00 app[web.1]: [07-Apr-2021 14:07:08] WARNING: [pool www] child 145, script '/app/public/index.php' (request: "GET /index.php") executing too slow (3.062416 sec), logging
2021-04-07T14:07:08.268185+00:00 app[web.1]:
2021-04-07T14:07:08.268269+00:00 app[web.1]: [07-Apr-2021 14:07:08] [pool www] pid 145
2021-04-07T14:07:08.268346+00:00 app[web.1]: script_filename = /app/public/index.php
2021-04-07T14:07:08.268538+00:00 app[web.1]: [0x00007f4cb9618040] execute() /app/vendor/doctrine/dbal/lib/Doctrine/DBAL/Driver/PDOStatement.php:112
2021-04-07T14:07:08.268729+00:00 app[web.1]: [0x00007f4cb9617fc0] execute() /app/vendor/laravel/framework/src/Illuminate/Database/Connection.php:343
2021-04-07T14:07:08.269002+00:00 app[web.1]: [0x00007f4cb9617f20] Illuminate\Database\{closure}() /app/vendor/laravel/framework/src/Illuminate/Database/Connection.php:671
2021-04-07T14:07:08.269207+00:00 app[web.1]: [0x00007f4cb9617e60] runQueryCallback() /app/vendor/laravel/framework/src/Illuminate/Database/Connection.php:638
2021-04-07T14:07:08.269390+00:00 app[web.1]: [0x00007f4cb9617da0] run() /app/vendor/laravel/framework/src/Illuminate/Database/Connection.php:346
2021-04-07T14:07:08.269589+00:00 app[web.1]: [0x00007f4cb9617d10] select() /app/vendor/laravel/framework/src/Illuminate/Database/Query/Builder.php:2313
2021-04-07T14:07:08.269788+00:00 app[web.1]: [0x00007f4cb9617ca0] runSelect() /app/vendor/laravel/framework/src/Illuminate/Database/Query/Builder.php:2301
2021-04-07T14:07:08.270038+00:00 app[web.1]: [0x00007f4cb9617bd0] Illuminate\Database\Query\{closure}() /app/vendor/laravel/framework/src/Illuminate/Database/Query/Builder.php:2796
2021-04-07T14:07:08.270249+00:00 app[web.1]: [0x00007f4cb9617b30] onceWithColumns() /app/vendor/laravel/framework/src/Illuminate/Database/Query/Builder.php:2302
2021-04-07T14:07:08.270443+00:00 app[web.1]: [0x00007f4cb9617a50] get() /app/vendor/laravel/framework/src/Illuminate/Database/Eloquent/Builder.php:588
2021-04-07T14:07:08.270650+00:00 app[web.1]: [0x00007f4cb9617920] getModels() /app/vendor/laravel/framework/src/Illuminate/Database/Eloquent/Builder.php:572
2021-04-07T14:07:08.270837+00:00 app[web.1]: [0x00007f4cb9617820] get() /app/vendor/laravel/framework/src/Illuminate/Database/Eloquent/Model.php:477
2021-04-07T14:07:08.270978+00:00 app[web.1]: [0x00007f4cb96177a0] all() /app/vendor/tcg/voyager/routes/voyager.php:39
2021-04-07T14:07:08.271163+00:00 app[web.1]: [0x00007f4cb96176d0] {closure}() /app/vendor/laravel/framework/src/Illuminate/Routing/Router.php:423
2021-04-07T14:07:08.271362+00:00 app[web.1]: [0x00007f4cb9617650] loadRoutes() /app/vendor/laravel/framework/src/Illuminate/Routing/Router.php:382
2021-04-07T14:07:08.271552+00:00 app[web.1]: [0x00007f4cb96175b0] group() /app/vendor/laravel/framework/src/Illuminate/Support/Facades/Facade.php:261
2021-04-07T14:07:08.271709+00:00 app[web.1]: [0x00007f4cb9617500] __callStatic() /app/vendor/tcg/voyager/routes/voyager.php:133
2021-04-07T14:07:08.271894+00:00 app[web.1]: [0x00007f4cb9617470] {closure}() /app/vendor/laravel/framework/src/Illuminate/Routing/Router.php:423
2021-04-07T14:07:08.272080+00:00 app[web.1]: [0x00007f4cb96173f0] loadRoutes() /app/vendor/laravel/framework/src/Illuminate/Routing/Router.php:382
2021-04-07T14:07:08.272276+00:00 app[web.1]: [0x00007f4cb9617350] group() /app/vendor/laravel/framework/src/Illuminate/Support/Facades/Facade.php:261
2021-04-07T14:07:09.811958+00:00 app[web.1]: 99.99.99.99 - - [07/Apr/2021:14:07:05 +0000] "GET /api/v1/mentors HTTP/1.1" 200 7278 "-" "PostmanRuntime/7.26.10
2021-04-07T14:07:09.819006+00:00 heroku[router]: at=info method=GET path="/api/v1/mentors" host=xxxxxxxxxxxxxx.herokuapp.com request_id=17d1f729-e3a2-41dc-8c92-f0b917623b74 fwd="999.999.999.99" dyno=web.1 connect=1ms service=4615ms status=200 bytes=7586 protocol=https
But at other times things are snappy and I just get these types of logs with a 1-2 second response time:
2021-04-07T14:15:37.182510+00:00 app[web.1]: 99.99.99.99 - - [07/Apr/2021:14:15:36 +0000] "GET /api/v1/mentors HTTP/1.1" 200 7278 "-" "PostmanRuntime/7.26.10
2021-04-07T14:15:37.183580+00:00 heroku[router]: at=info method=GET path="/api/v1/mentors" host=xxxxxxxxxxxxxxxxxxx.herokuapp.com request_id=0b8ccc3f-8e3f-44bf-91c0-71ef6d245ad2 fwd="999.999.999.999" dyno=web.1 connect=0ms service=300ms status=200 bytes=7586 protocol=https
2021-04-07T14:15:47.637135+00:00 app[web.1]: 99.99.99.99 - - [07/Apr/2021:14:15:47 +0000] "GET /api/v1/mentors HTTP/1.1" 200 7278 "-" "PostmanRuntime/7.26.10
2021-04-07T14:15:47.638294+00:00 heroku[router]: at=info method=GET path="/api/v1/mentors" host=xxxxxxxxxxxxxxxxxxx.herokuapp.com request_id=8b415575-eb68-4fd4-87df-bca90f5fbf26 fwd="999.999.999.999" dyno=web.1 connect=0ms service=454ms status=200 bytes=7586 protocol=https
2021-04-07T14:15:51.658485+00:00 heroku[router]: at=info method=GET path="/api/v1/mentors" host=xxxxxxxxxxxxxxxxxxx.herokuapp.com request_id=c91c02d4-5141-4002-81aa-05507c0ea3f9 fwd="999.999.999.999" dyno=web.1 connect=1ms service=729ms status=200 bytes=7586 protocol=https
2021-04-07T14:15:51.655872+00:00 app[web.1]: 99.99.99.99 - - [07/Apr/2021:14:15:50 +0000] "GET /api/v1/mentors HTTP/1.1" 200 7278 "-" "PostmanRuntime/7.26.10
I thought it might be some queries with a number of relationships defined, but even basic lookups on a single table give the same varied response times.
I also looked at the configureRateLimiting value and increased it from 60 to 10000, but nothing seems to change.
Has anybody experienced something similar? Any ideas where to start troubleshooting this weirdness?
Chris
I'm also having the same issue. However, it only occurs on the first request to the website after I deploy changes to Heroku. Subsequent requests after that are okay.
When slow execution is detected, PHP-FPM logs the backtrace of function calls. In your case the app starts to slow down when it reaches the database layer. Quoting from your log:
execute() /app/vendor/doctrine/dbal/lib/Doctrine/DBAL/Driver/PDOStatement.php:112
execute() /app/vendor/laravel/framework/src/Illuminate/Database/Connection.php:343
...
Perhaps you are connecting to a remote database (a provider other than Heroku)? You can try installing the composer package barryvdh/laravel-debugbar in your development environment to see how many queries you run, whether any are duplicated, and how long the queries take before the page loads.
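If the debugbar shows the same query repeated once per row, that's the classic N+1 pattern from lazy-loaded relationships. A minimal sketch - Mentor and its profile relation are made-up names standing in for whatever the real models are:

```php
// Hypothetical Mentor model with a profile() relationship.
// Lazy loading: 1 query for the mentors + 1 query per mentor.
$mentors = Mentor::all();
foreach ($mentors as $mentor) {
    echo $mentor->profile->bio; // triggers an extra query each iteration
}

// Eager loading: 2 queries total, however many mentors there are.
$mentors = Mentor::with('profile')->get();
```

That alone can account for multi-second response times on endpoints that return dozens of rows against a remote database.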
Here are the logs on my end:
2021-11-18T06:46:17.662199+00:00 app[api]: Release v523 created by user bk2o1.syndicates#gmail.com
2021-11-18T06:46:18.304816+00:00 heroku[web.1]: State changed from down to starting
2021-11-18T06:46:18.355707+00:00 heroku[worker.1]: State changed from down to starting
2021-11-18T06:46:25.941551+00:00 heroku[worker.1]: Starting process with command `php artisan queue:work`
2021-11-18T06:46:26.562345+00:00 heroku[worker.1]: State changed from starting to up
2021-11-18T06:46:28.137180+00:00 heroku[web.1]: Starting process with command `vendor/bin/heroku-php-apache2 public/`
2021-11-18T06:46:30.999737+00:00 app[web.1]: DOCUMENT_ROOT changed to 'public/'
2021-11-18T06:46:31.213248+00:00 app[web.1]: Detected 536870912 Bytes of RAM
2021-11-18T06:46:31.279814+00:00 app[web.1]: PHP memory_limit is 128M Bytes
2021-11-18T06:46:31.279869+00:00 app[web.1]: Starting php-fpm with 4 workers...
2021-11-18T06:46:31.354604+00:00 app[web.1]: Starting httpd...
2021-11-18T06:46:31.868967+00:00 heroku[web.1]: State changed from starting to up
2021-11-18T06:46:59.375247+00:00 app[web.1]: [18-Nov-2021 06:46:59] WARNING: [pool www] child 146, script '/app/public/index.php' (request: "GET /index.php") executing too slow (3.248194 sec), logging
2021-11-18T06:46:59.375717+00:00 app[web.1]:
2021-11-18T06:46:59.375797+00:00 app[web.1]: [18-Nov-2021 06:46:59] [pool www] pid 146
2021-11-18T06:46:59.375873+00:00 app[web.1]: script_filename = /app/public/index.php
2021-11-18T06:46:59.376056+00:00 app[web.1]: [0x00007f6d6dc1b4a0] curl_multi_select() /app/vendor/guzzlehttp/guzzle/src/Handler/CurlMultiHandler.php:160
2021-11-18T06:46:59.376216+00:00 app[web.1]: [0x00007f6d6dc1b3f0] tick() /app/vendor/guzzlehttp/guzzle/src/Handler/CurlMultiHandler.php:183
2021-11-18T06:46:59.376355+00:00 app[web.1]: [0x00007f6d6dc1b370] execute() /app/vendor/guzzlehttp/promises/src/Promise.php:248
2021-11-18T06:46:59.376501+00:00 app[web.1]: [0x00007f6d6dc1b2d0] invokeWaitFn() /app/vendor/guzzlehttp/promises/src/Promise.php:224
2021-11-18T06:46:59.376648+00:00 app[web.1]: [0x00007f6d6dc1b250] waitIfPending() /app/vendor/guzzlehttp/promises/src/Promise.php:269
2021-11-18T06:46:59.376797+00:00 app[web.1]: [0x00007f6d6dc1b1c0] invokeWaitList() /app/vendor/guzzlehttp/promises/src/Promise.php:226
2021-11-18T06:46:59.376940+00:00 app[web.1]: [0x00007f6d6dc1b140] waitIfPending() /app/vendor/guzzlehttp/promises/src/Promise.php:62
2021-11-18T06:46:59.377110+00:00 app[web.1]: [0x00007f6d6dc1b0b0] wait() /app/vendor/microsoft/azure-storage-blob/src/Blob/BlobRestProxy.php:3018
2021-11-18T06:46:59.377313+00:00 app[web.1]: [0x00007f6d6dc1b020] getBlobProperties() /app/vendor/league/flysystem-azure-blob-storage/src/AzureBlobStorageAdapter.php:240
2021-11-18T06:46:59.377510+00:00 app[web.1]: [0x00007f6d6dc1aef0] getMetadata() /app/vendor/league/flysystem-azure-blob-storage/src/AzureBlobStorageAdapter.php:153
2021-11-18T06:46:59.377670+00:00 app[web.1]: [0x00007f6d6dc1ae80] has() /app/vendor/league/flysystem-cached-adapter/src/CachedAdapter.php:212
2021-11-18T06:46:59.377801+00:00 app[web.1]: [0x00007f6d6dc1add0] has() /app/vendor/league/flysystem/src/Filesystem.php:58
2021-11-18T06:46:59.377981+00:00 app[web.1]: [0x00007f6d6dc1ad50] has() /app/vendor/laravel/framework/src/Illuminate/Filesystem/FilesystemAdapter.php:110
2021-11-18T06:46:59.378080+00:00 app[web.1]: [0x00007f6d6dc1ace0] exists() /app/app/Models/Image.php:39
2021-11-18T06:46:59.378317+00:00 app[web.1]: [0x00007f6d6dc1ac20] getOriginalUrlAttribute() /app/vendor/laravel/framework/src/Illuminate/Database/Eloquent/Concerns/HasAttributes.php:527
2021-11-18T06:46:59.378532+00:00 app[web.1]: [0x00007f6d6dc1ab90] mutateAttribute() /app/vendor/laravel/framework/src/Illuminate/Database/Eloquent/Concerns/HasAttributes.php:541
2021-11-18T06:46:59.378762+00:00 app[web.1]: [0x00007f6d6dc1ab00] mutateAttributeForArray() /app/vendor/laravel/framework/src/Illuminate/Database/Eloquent/Concerns/HasAttributes.php:163
2021-11-18T06:46:59.378960+00:00 app[web.1]: [0x00007f6d6dc1aa60] attributesToArray() /app/vendor/laravel/framework/src/Illuminate/Database/Eloquent/Model.php:1403
2021-11-18T06:46:59.379155+00:00 app[web.1]: [0x00007f6d6dc1a990] toArray() /app/vendor/laravel/framework/src/Illuminate/Collections/Traits/EnumeratesValues.php:828
2021-11-18T06:46:59.379373+00:00 app[web.1]: [0x00007f6d6dc1a910] Illuminate\Support\Traits\{closure}() /app/vendor/laravel/framework/src/Illuminate/Collections/Collection.php:642
2021-11-18T06:47:26.126488+00:00 heroku[router]: at=error code=H12 desc="Request timeout" method=GET path="/" host=my-website-url-obviously.herokuapp.com request_id=825cada4-f0e1-4019-9edf-963bb47c995c fwd="112.200.130.157" dyno=web.1 connect=0ms service=30000ms status=503 bytes=0 protocol=https
2021-11-18T06:47:26.420440+00:00 app[web.1]: [18-Nov-2021 06:47:26] WARNING: [pool www] child 146, script '/app/public/index.php' (request: "GET /index.php") execution timed out (30.290064 sec), terminating
2021-11-18T06:47:26.983257+00:00 app[web.1]: 10.1.16.72 - - [18/Nov/2021:06:46:56 +0000] "GET / HTTP/1.1" 200 20063 "https://dashboard.heroku.com/" "Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/95.0.4638.69 Safari/537.36 Edg/95.0.1020.44
These consist of calls through Guzzle, since I was fetching images from Azure:
getBlobProperties() /app/vendor/league/flysystem-azure-blob-storage/src/AzureBlobStorageAdapter.php:240
exists() /app/app/Models/Image.php:39
And on my App\Models\Image line 39, I was checking whether an image exists before actually fetching it. The issue came up when I eager loaded 62 images:
// in App\Models\Image.php
if (Storage::disk($this->disk)->exists($path . '/' . $filename)) {
    // fetch the thing here
}
I solved it by skipping the check and just returning a generated URL using the Storage facade:
// in App\Models\Image.php
return Storage::disk($this->disk)->url($path.'/'.$filename).'?'.$this->updated_at?->format('U');
Another thing to consider is how much traffic your app is receiving when things start going slow. Data displayed dynamically on a page can take up a huge amount of memory, depending on how much data is kept in memory at runtime and how many visitors are requesting data (note: EVEN IF THE DATA IS CACHED). From my experience, I have also noticed sudden slow response times when bots are visiting the app.
This doesn't directly solve your issue; it's more of an overview of what might be going on under the hood. Most importantly, the logs give a hint about what is happening, and you can go from there.
I have the following code that is working perfectly on my local machine.
<?php
header("Content-Type:application/json");
require __DIR__.'/vendor/autoload.php';
use Kreait\Firebase\Factory;
use Kreait\Firebase\ServiceAccount;
// This assumes that you have placed the Firebase credentials in the same directory
// as this PHP file.
$serviceAccount = ServiceAccount::fromJsonFile(__DIR__.'/xxxx-xxxxx-xxxxxxxxxxxxx.json');
$firebase = (new Factory)
->withServiceAccount($serviceAccount)
->create();
$database = $firebase->getDatabase();
$newPost = $database
    ->getReference('blog/posts')
    ->push([
        'title' => 'Post title',
        'body'  => 'This should probably be longer.'
    ]);
echo($newPost->getKey());
echo '{"ResultCode":0,"ResultDesc":"Confirmation received successfully"}';
?>
But when I upload the same code to my Heroku app and try to open the file's URL, I get an HTTP 500 error.
What could I have forgotten? I've tried looking for a solution online, but nothing has really helped. I've tried reinstalling dependencies but still have the same problem.
The following is my Heroku log output from when I tried running the app this morning.
2018-10-27T05:15:35.613487+00:00 heroku[web.1]: Unidling
2018-10-27T05:15:35.613848+00:00 heroku[web.1]: State changed from down to starting
2018-10-27T05:15:37.230016+00:00 heroku[web.1]: Starting process with command `heroku-php-apache2`
2018-10-27T05:15:39.635790+00:00 app[web.1]: Optimizing defaults for 1X dyno...
2018-10-27T05:15:39.776372+00:00 app[web.1]: 4 processes at 128MB memory limit.
2018-10-27T05:15:39.781273+00:00 app[web.1]: Starting php-fpm...
2018-10-27T05:15:41.783183+00:00 app[web.1]: Starting httpd...
2018-10-27T05:15:41.925232+00:00 heroku[web.1]: State changed from starting to up
2018-10-27T05:15:43.694660+00:00 heroku[router]: at=info method=GET path="/rGU7qCmigsrVL4SuYNS01/confirmation.php" host=cryptic-forest-94367.herokuapp.com request_id=cda944ce-b0ea-4cda-92f9-cba7c9b1c3e5 fwd="196.207.150.137" dyno=web.1 connect=0ms service=6ms status=500 bytes=161 protocol=https
2018-10-27T05:15:43.694391+00:00 app[web.1]: [27-Oct-2018 05:15:43 UTC] PHP Warning: require(/app/rGU7qCmigsrVL4SuYNS01/vendor/composer/../guzzlehttp/psr7/src/functions_include.php): failed to open stream: No such file or directory in /app/rGU7qCmigsrVL4SuYNS01/vendor/composer/autoload_real.php on line 66
2018-10-27T05:15:43.695104+00:00 app[web.1]: [27-Oct-2018 05:15:43 UTC] PHP Fatal error: require(): Failed opening required '/app/rGU7qCmigsrVL4SuYNS01/vendor/composer/../guzzlehttp/psr7/src/functions_include.php' (include_path='.:/app/.heroku/php/lib/php') in /app/rGU7qCmigsrVL4SuYNS01/vendor/composer/autoload_real.php on line 66
2018-10-27T05:15:43.695646+00:00 app[web.1]: 10.5.165.56 - - [27/Oct/2018:05:15:43 +0000] "GET /rGU7qCmigsrVL4SuYNS01/confirmation.php HTTP/1.1" 500 - "-" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/69.0.3497.100 Safari/537.36
I am upgrading a Laravel web application from PHP 5.6 to PHP 7.1, which leads me to upgrade the libevent module. The application is async and based on the react library.
So I ended up with PHP 7.1.12 and libevent 2.1.8 + expressif/pecl-event-libevent installed, and I get a stable "502 Bad Gateway" from nginx. Without libevent (ReactStreamLoop), or on PHP 5.6 + libevent 1.4, it works fine.
The request lands in index.php and something happens later, during application startup.
nginx log:
2017/11/24 10:41:24 [error] 24985#0: *7 recv() failed (104: Connection reset by peer) while reading response header from upstream, client: 14.183.16.180, server: 173.199.117.122, request: "GET / HTTP/1.1", upstream: "fastcgi://127.0.0.1:9000", host: "173.199.117.122"
2017/11/24 10:41:25 [error] 24985#0: *7 recv() failed (104: Connection reset by peer) while reading response header from upstream, client: 14.183.16.180, server: 173.199.117.122, request: "GET / HTTP/1.1", upstream: "fastcgi://127.0.0.1:9000", host: "173.199.117.122"
php-fpm log:
[24-Nov-2017 10:41:24] WARNING: [pool www] child 22300 exited on signal 11 (SIGSEGV) after 39.486978 seconds from start
[24-Nov-2017 10:41:24] NOTICE: [pool www] child 22331 started
[24-Nov-2017 10:41:25] WARNING: [pool www] child 22301 exited on signal 11 (SIGSEGV) after 40.205103 seconds from start
[24-Nov-2017 10:41:25] NOTICE: [pool www] child 22332 started
UPDATE: it works if I force usage of ReactStreamLoop.
UPDATE: reproduced locally on a similar config. I found an example that crashes with LibEventLoop but works with StreamSelectLoop.
require_once __DIR__.'/../vendor/autoload.php';

$loop = new \React\EventLoop\LibEventLoop();
//$loop = new \React\EventLoop\StreamSelectLoop();

$config = array(
    'host'    => '127.0.0.1',
    'port'    => '3306',
    'dbname'  => 'mysql',
    'user'    => 'root',
    'passwd'  => 'root',
    'charset' => 'utf8',
);

$client = new \React\MySQL\Connection($loop, $config);
$client->connect(function() {});
$client->query('select 8 as cnt', function () {
    echo "inside\n";
});

echo "start\n";
$loop->run();
output:
#php ./tests/test.php
start
Segmentation fault (core dumped)
The extension you're using is not compatible with PHP 7 and up. You have to use one of the other event loop implementations, such as the one based on stream_select() or one of the supported extensions.
See https://github.com/reactphp/event-loop/pull/62 for further information.
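In practice that means letting the factory pick a loop implementation that is actually supported on the current PHP build, instead of instantiating LibEventLoop directly. A sketch against the react/event-loop API of that era (before the Loop facade replaced the factory):

```php
require_once __DIR__.'/../vendor/autoload.php';

// Factory::create() selects the best available loop implementation,
// falling back to StreamSelectLoop when no compatible extension is loaded.
$loop = \React\EventLoop\Factory::create();

$loop->addTimer(1.0, function () {
    echo "tick\n";
});

$loop->run();
```

Swapping `new \React\EventLoop\LibEventLoop()` for `Factory::create()` in the crashing test script above should make it run on both PHP 5.6 and 7.1.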
I have looked through all the threads on this forum before posting this one. None of the solutions solved my issue, so I am forced to open a new thread.
I have the include below in my code, but it errors out with "failed to open stream: HTTP request failed!". I already have these set in php.ini:
allow_url_include = On
allow_url_fopen = On
but it still fails.
1. Below is the include defined in /test/foo.php, which includes a file on the same server, /test/bar.php:
<div class="tab-content" style=''>
    <div class="tab-pane active" id="my1"><?php include('http://' . $_SERVER['SERVER_NAME'] . ':' . $_SERVER['SERVER_PORT'] . "/bar.php?env=test1&days=3&start=$start&end=$end"); ?></div>
</div>
2. Here is the directory structure:
a) /test/foo.php --> this has include to my own server.
b) /test/bar.php
3. The Apache document root points to /test:
/var/www/html --> /test
4. echo __DIR__ shows "/test", so it's definitely pointing at the right directory.
5. I have given full permission to this directory in case that's the issue, but no luck.
6. Here is the exact error in the Apache error log for one of the above includes; it fails the same way for all of them. The server name and port are intentionally removed from the log below.
[Sun Jul 07 15:01:47 2013] [error] [client ] PHP Warning: include(http://:/bar.php?env=my1&days=3&start=2013-06-07&end=2013-07-07): failed to open stream: HTTP request failed! in /test/foo.php on line 36
[Sun Jul 07 15:01:47 2013] [error] [client ] PHP Warning: include(): Failed opening 'http://:/bar.php?env=my1&days=3&start=2013-06-07&end=2013-07-07' for inclusion (include_path='.:/usr/share/pear:/usr/share/php') in /test/foo.php on line 36
If
include('http://' . $_SERVER['SERVER_NAME'] . ':'
    . $_SERVER['SERVER_PORT']
    . "/bar.php?env=test1&days=3&start=$start&end=$end");
is trying to fetch from
http://:/bar.php?env=my1&days=3&start=2013-06-07&end=2013-07-07
then $_SERVER['SERVER_NAME'] and $_SERVER['SERVER_PORT'] must be blank. Are you running this from the command line?
(I assume you know that, even with literal values, you have created a big security hole in your system, that the resulting script will run massively slower than including the file directly, and that even if the variables were populated, the routing/DNS on many hosts wouldn't necessarily let you access the code this way.)
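For completeness, here is what including the file directly would look like - a sketch that assumes bar.php is adapted to read ordinary variables instead of $_GET (the variable names are illustrative):

```php
<?php
// foo.php - include bar.php through the filesystem, not over HTTP.
// bar.php must be changed to read these variables rather than $_GET.
$env   = 'test1';
$days  = 3;
$start = '2013-06-07';
$end   = '2013-07-07';

include __DIR__ . '/bar.php'; // bar.php sees the variables set above
```

A filesystem include runs bar.php in the same request, with no extra HTTP round trip and no exposure of bar.php as a publicly callable URL.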
I have this strange problem when I run an import cronjob I made.
It processes a 1.5 GB XML file containing about 400,000 products. The script works fine and would take multiple hours to complete, but after about 500-600 seconds I get the following email from the cron daemon:
PHP Warning: file_get_contents(http://test.nl/admin/cron_index.php?route=module/EZImport&cron): failed to open stream: HTTP request failed! HTTP/1.1 500 Internal Server Error in /home/test.nl/public_html/admin/controller/tool/EZImport_cron.php on line 8
bool(false)
My Apache error logs say:
[Fri Nov 02 09:43:39 2012] [warn] [client 176.9..174] (104)Connection reset by peer: mod_fcgid: error reading data from FastCGI server
[Fri Nov 02 09:43:39 2012] [error] [client 176.9..174] Premature end of script headers: cron_index.php
This is the cron file called by the cronjob:
require_once('../../config.php');

$opts = array(
    'http' => array('timeout' => 36000)
);
$context = stream_context_create($opts);

$url = HTTP_SERVER . "cron_index.php?route=module/cronMod&cron";
$result = file_get_contents($url, false, $context);
var_dump($result);
die();
I need to run this cron via file_get_contents.
Environment: Debian, OpenCart.
The max execution time in Webmin (PHP config) is set to 36000 seconds.
processes a 1.5GB XML file
Erk, this is a bit silly - you need a minimum of 2 passes to verify the document is well formed, and there's lots of scope for bad things to happen.
The max execution time in webmin (php config) is set to 36000 seconds
For which end?
You also need to configure the timeout for the webserver and every other component in the chain between client and server. However, trying to push a 1.5 GB import through a single HTTP request is just silly - you might get it to work, but it's not the right way to solve the problem. Break it up into more manageable chunks.
Try using set_time_limit(0) in the script or change the time limit in php.ini
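On the "more manageable chunks" point: rather than driving the whole import through one long HTTP request, the 1.5 GB file can be streamed product-by-product with XMLReader, which keeps memory use flat. A sketch - the file path and the <product> element name are assumptions about what the real feed looks like:

```php
<?php
// Stream a large XML feed one <product> element at a time.
// The path and element name are placeholders for the real feed.
$reader = new XMLReader();
$reader->open('/home/test.nl/feeds/products.xml');

$doc = new DOMDocument();
while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT
        && $reader->localName === 'product') {
        // Expand only this node into a small in-memory tree.
        $product = simplexml_import_dom(
            $doc->importNode($reader->expand(), true)
        );
        // importProduct($product); // your per-product import logic
    }
}

$reader->close();
```

Run as a CLI script (`php artisan`-style or plain `php import.php`) from cron, this avoids the webserver and FastCGI timeouts entirely; only PHP's own max_execution_time applies, and the CLI default for that is unlimited.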