I'm running a PHP script from a Unix shell and I'm getting the following errors.
The script is a web scraper and it works fine on my host when I access it through the web, but I want to run it as a cron job. What should I do?
Warning: file_get_contents(http://www.lomamatkat.fi/iframes/last-minute-offers/?lastminute_next=50): failed to open stream: no suitable wrapper could be found in /var/www/customers/lentovertailufi/public_html/matkat/script/lomamatkat.php on line 30
PHP Warning: file_get_contents(): URL file-access is disabled in the server configuration in /var/www/customers/lentovertailufi/public_html/matkat/script/lomamatkat.php on line 30
Update your php.ini file, and set the 'allow_url_fopen' option to 'On'.
If you are unable to change allow_url_fopen, consider using cURL (if it must be PHP) or wget.
Go to php.ini and enable allow_url_fopen, or use cURL or one of the other HTTP GET functions. file_get_contents is considered a risky call for URLs because user-supplied input can slip local file paths into it as well.
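If allow_url_fopen has to stay off, a minimal cURL sketch along these lines can stand in for the file_get_contents() call (the URL is just the one from the warning above; adjust as needed):

<?php
// Fetch the remote page with cURL instead of file_get_contents().
$url = 'http://www.lomamatkat.fi/iframes/last-minute-offers/?lastminute_next=50';
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body rather than printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
$html = curl_exec($ch);
if ($html === false) {
    fwrite(STDERR, 'cURL error: ' . curl_error($ch) . PHP_EOL);
}
curl_close($ch);

Run from cron it behaves the same as from the shell, since it doesn't depend on allow_url_fopen at all.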
When I create a scheduled task to run a PHP script in Plesk Onyx on Windows, it results in an error.
However, when I run the same script in the browser, it works without any issues.
I have checked the permission settings in the webroot and set them to allow access to all user groups on the server.
The error I get is the following:
Warning: require(\pcp2\inc\db_config.php): failed to open stream: No such file or directory in D:\www\domain\pcp2\conversion\addBooking.php on line 5
Fatal error: require(): Failed opening required '\pcp2\inc\db_config.php' (include_path='.;.\includes;.\pear') in D:\www\domain\pcp2\conversion\addBooking.php on line 5
Line 5 contains the following info:
require($_SERVER['DOCUMENT_ROOT']."\pcp2\inc\db_config.php");
It's failing because $_SERVER['DOCUMENT_ROOT'] is a value provided by the web server, and is thus undefined when the script is run without a web server (i.e., from the command line). You'll need to provide an alternative mechanism to set the base directory.
You might use relative paths:
require("pcp2\inc\db_config.php");
Or absolute paths based on the magic constant __DIR__. (This assumes the script doing the require'ing is in the document root directory.)
require(__DIR__."\pcp2\inc\db_config.php");
Ideally, however, you're better off using PSR-4 namespaces with an autoloader.
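As a rough illustration of that last point, here is a minimal PSR-4 style autoloader sketch; the App\ prefix and src/ directory are assumptions for the example only (in practice Composer generates the autoloader for you):

<?php
// Hypothetical PSR-4 autoloader: maps App\Foo\Bar to src/Foo/Bar.php.
spl_autoload_register(function ($class) {
    $prefix  = 'App\\';           // assumed namespace prefix
    $baseDir = __DIR__ . '/src/'; // assumed source directory
    if (strncmp($class, $prefix, strlen($prefix)) !== 0) {
        return; // not one of our classes
    }
    $relative = substr($class, strlen($prefix));
    $file = $baseDir . str_replace('\\', '/', $relative) . '.php';
    if (is_file($file)) {
        require $file;
    }
});

With that in place, new App\Db\Config() loads src/Db/Config.php regardless of whether the script runs under the web server or from a scheduled task.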
I have recently set up XAMPP. The setup was straightforward but I don't seem to know the correct tweak to allow it to speak to the outside web world.
No matter how I try to read an external URL, it tells me where to get off.
(In PHP) I've tried the simple file_get_contents route. When that failed, somebody pointed me to curl. I enabled that in php.ini but that failed too.
I get the very unhelpful "Unable to open file"
Fatal error: Unable to open "https://earth.esa.int/documents/10174/1514862/Swarm_Level-2_TEC_Product_Description" in C:\xampp\htdocs\includes\PdfToText.phpclass:1665 Stack trace: #0 #2 {main} thrown in C:\xampp\htdocs\includes\PdfToText.phpclass on line 1665
I know this looks like an error in the PdfToText class (above), but that's just where the error surfaces. Take the file from between the quote marks, try to load it yourself, and it's fine. PHP simply doesn't want to open files from the outside world, no matter what I try.
I assume it's a port/proxy/something, but I've Googled all day, in and out of Stack Overflow, and I cannot find the same problem anywhere.
Quick fix:
Find your php.ini file:
php -i | grep "Loaded Configuration File"
Look for allow_url_fopen and set it to On:
allow_url_fopen = On
Explanation:
This error happens because functions like file_get_contents or fopen work with file pointers. What allow_url_fopen lets PHP do is resolve the URL, open a TCP connection, and create a network file pointer to that connection, which PHP then handles as if it were a regular file.
For security reasons, this setting is disabled on some installations.
Check the docs:
http://php.net/manual/en/filesystem.configuration.php
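To see what that means in practice, here is a rough sketch of treating a URL as a file pointer once allow_url_fopen is On (example.com is a placeholder host):

<?php
// With allow_url_fopen = On, an HTTP URL can be opened like a local file.
$fp = fopen('http://example.com/', 'r'); // network file pointer over a TCP connection
if ($fp === false) {
    die('Could not open the URL');
}
while (!feof($fp)) {
    echo fread($fp, 8192); // read and output the response body in chunks
}
fclose($fp);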
I'm running a PHP site on localhost and I'm getting the following error:
Warning: file_get_contents(http://www.engadget.com/rss.xml) [function.file-get-contents]: failed to open stream: HTTP request failed! in C:\Program Files\xampp\htdocs\infohut\rss_read_class.php on line 26
But when I run the same script hosted on a real server, it doesn't give any error.
Also note that my local PC connects through a proxy, and when I tried from a different PC with a direct connection to the internet, the problem was not there.
So my guess is that it has something to do with the proxy.
I'm using the XAMPP for Windows installation with PHP 5.1.4 and Apache 2.2.2.
I tried adding my proxy settings to php.ini as well, using the directives below:
pfpro.proxyaddress
pfpro.proxyport
I still couldn't figure it out. Please advise if I need to change any other settings or anything else.
Thanks
You should enable allow_url_fopen in your php.ini.
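Since the question mentions a proxy, it may also help to know that file_get_contents() can be sent through one with a stream context; this is only a sketch, and proxy.example.com:8080 is a placeholder for your actual proxy host and port:

<?php
// Route file_get_contents() through an HTTP proxy using a stream context.
$context = stream_context_create(array(
    'http' => array(
        'proxy'           => 'tcp://proxy.example.com:8080', // placeholder proxy
        'request_fulluri' => true, // many proxies expect the full URI in the request line
    ),
));
$xml = file_get_contents('http://www.engadget.com/rss.xml', false, $context);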
I am using file_get_contents in my PHP script and it throws some errors:
My code
#try to fetch from remote
$this->remotePath = "http://some-hostname.com/blah/blah.xml";
$fileIn = @file_get_contents($this->remotePath);
The errors:
Warning: file_get_contents() [function.file-get-contents]: URL file-access is disabled in the server configuration in /virtual/path/to/file/outputFile.php on line 127
Warning: file_get_contents(https://some-host-name/data/inputFile.xml) [function.file-get-contents]: failed to open stream: no suitable wrapper could be found in /virtual/path/to/file/outputFile.php on line 127
Any ideas? It worked fine on my computer but stopped working when I moved it to the web server.
Your server must have the allow_url_fopen setting enabled. Being on a free web host would explain it, as the setting is usually disabled there to prevent abuse. If you paid for your hosting, get in contact with your host so they can enable it for you.
If changing that setting is not an option, then have a look at the cURL library.
It seems the allow_url_fopen setting is disabled on your server, which is why URLs cannot be used with file_get_contents().
Try using cURL instead; it is a more robust and efficient way of communicating with another server.
I am working on a script that fetches CSV files from a web server. I am using file_get_contents at the moment. Sometimes I get the message:
Warning: file_get_contents failed to open stream: Connection timed out
I assume it can be due to the website being down. Or can there be a situation where the website is fine but this warning still shows up? Also, what advantage does cURL provide over this function?
This can happen when the remote server doesn't respond in time, for example because the site is down, overloaded, or unreachable from your server.
For accessing remote files, you should use cURL. You can set cURL to time out quietly if the remote server takes too long.
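As a rough sketch of what "timing out quietly" can look like, with the URL and limits as placeholders you would tune yourself:

<?php
// Fetch a CSV with cURL and give up quietly if the remote server is too slow.
$ch = curl_init('http://example.com/data.csv'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5); // seconds to wait for the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 15);       // seconds allowed for the whole transfer
$csv = curl_exec($ch);
if ($csv === false) {
    error_log('Fetch failed: ' . curl_error($ch)); // log it instead of emitting a warning
}
curl_close($ch);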