I have a simple PHP script that runs on IIS 6 on a Windows Server 2003 machine. All the script does is try to open a file on a shared network location and display its contents on the screen. Try as I might, I cannot get that to work; I keep getting a "failed to open stream" error message. After a lot of reading (I'm in my second week of working on this) I have narrowed the problem down to a server configuration issue, since I can run the script through the command prompt and it works fine.
If I run var_dump(shell_exec('whoami')) in the script, it returns NULL. If I run that same command at the command prompt, it returns the currently logged-in user (i.e. me). Task Manager reports that the user for w3wp.exe is "NETWORK SERVICE".
I'm including the code below, although I'm 100% sure the code is not the problem, but some people like to look at code, so there it is. How do you configure or make changes on the server so that it allows reading from a network location? Also, the network location I'm trying to access has been set up with full permissions for everyone so that we can solve this one issue.
<?php
$theFile = "\\\\192.168.0.16\\geo\\junk.txt"; # network file: does NOT work
#$theFile = "junk.txt";                       # local file: works fine

$handle = fopen($theFile, "r");
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle);
        echo $buffer . "<br />";
    }
    fclose($handle);
}
?>
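For reference, here is a rough diagnostic sketch (my own addition, not part of the original script) that checks whether the account the web server runs PHP under can even see the share, using the same UNC path as above:
<?php
// Diagnostic sketch only: check whether the worker process can reach the share.
// Same UNC path as in the script above.
$theFile = "\\\\192.168.0.16\\geo\\junk.txt";

var_dump(getenv('USERNAME'));     // Windows environment variable; may hint at the worker identity
var_dump(file_exists($theFile));  // false if the account cannot see the share at all
var_dump(is_readable($theFile));  // false if it can see the share but cannot read the file
?>
If file_exists() already comes back false here, the problem is the identity the worker process uses to reach the share rather than the PHP code itself.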
I feel like this should be a pretty straightforward process.
I have the following code:
<?php
$filename = "c:/TestFolder/config.txt";
echo "Attempting to read: ".$filename."<br/>";
$fh = fopen($filename, 'r') or die("file doesn't exist");
$readtext = fread($fh, filesize($filename));
fclose($fh);
echo "The Text is: ".$readtext;
?>
I have checked that I do indeed have "config.txt" in a folder called "TestFolder" on my C:/ drive... but I keep getting an error that the file doesn't exist.
I checked my PHPInfo to ensure that "allow_url_fopen" is turned on.
I have also tried different file path variations, such as:
C:\\TestFolder\\config.txt
C:\/TestFolder\/config.txt
This doesn't seem to make a difference.
Any idea what might be preventing me from opening this file?
Edits:
It should be noted that the script is uploaded to my web host, while the file I am attempting to access is on my local machine. Not sure if this changes things at all.
This is not possible. "Local files" are files on the server where PHP is running, not on the client machine running the web browser. While they might be the same machine when you're testing locally, once you upload the script to a web host, PHP tries to read files on the web host, not on your machine.
The only way for PHP to access a file on the client machine is for the application to upload it.
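A minimal sketch of that upload route (the field name "config" and the single-page layout are just placeholders, not anything from the original code):
<?php
// Hypothetical single-page upload handler: the browser sends the file,
// and PHP only ever reads the uploaded copy that now lives on the server.
if (isset($_FILES['config']) && $_FILES['config']['error'] === UPLOAD_ERR_OK) {
    $text = file_get_contents($_FILES['config']['tmp_name']);
    echo "The Text is: " . nl2br(htmlspecialchars($text));
} else {
?>
<form method="post" enctype="multipart/form-data">
    <input type="file" name="config" />
    <input type="submit" value="Upload config.txt" />
</form>
<?php } ?>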
I've been battling this problem for days now. I'm 100% sure it's a user configuration thing somewhere, but I don't know where to look or what to change. Here is the problem: I have a file I want to read. It just contains five lines like "this is line 1", "this is line 2", etc.; it's for debugging purposes. I have two copies of this file: one lives locally and one lives on a network drive.
I can access the file locally with no problem. I can also access the file on the network drive if I specify the address as in \\192.168.0.16\geo\junk.txt. What I cannot do is access the file via a mapped drive, as in U:\junk.txt, where U: is mapped to the 192.168... address above.
Below is my code. Again, after A LOT of reading I've come to the conclusion that it's a user permission thing between the machine the code lives on and the Apache server that runs the code; I don't think the two are talking to each other. Just in case it matters, this is on a Windows 7 machine running Apache 2.2.
<?php
#$theFile = "\\\\192.168.0.16\\geo\\junk.txt"; # UNC path: works fine
#$theFile = "junk.txt";                        # local file: works fine
$theFile = "U:\\junk.txt";                     # mapped drive: DOESN'T WORK AAARRRGGG!!!!!

$handle = fopen($theFile, "r");
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle);
        echo $buffer . "<br />";
    }
    fclose($handle);
}
?>
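Not a fix for the mapping itself, but an illustrative sketch: drive letters are normally mapped per user, so the account running the Apache service may simply not see U:. Falling back to the UNC path (which already works above) at least keeps the script running and makes the failure visible:
<?php
// Illustrative sketch only: prefer the mapped drive, fall back to the
// UNC path when the service account cannot see the drive letter.
$mapped = "U:\\junk.txt";
$unc    = "\\\\192.168.0.16\\geo\\junk.txt";

$theFile = is_readable($mapped) ? $mapped : $unc;
echo "Reading from: " . $theFile . "<br />";

$handle = fopen($theFile, "r");
if ($handle) {
    while (!feof($handle)) {
        echo fgets($handle) . "<br />";
    }
    fclose($handle);
}
?>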
I'm having a strange problem using ftp_get() on one of two otherwise identical instances. One is on localhost and the other is on an actual server. I'm using the following to download a file via FTP. Both instances download from the same FTP server with the same credentials and the same paths.
$result = ftp_get($connection, $downloadPath, $serverPath, FTP_BINARY);
if ($result) {
    $successfulWrites[] = $downloadPath; // file name only, without path
} else {
    // On the second attempt to download a file with the same name,
    // ftp_get() returns false; this is where I throw an exception in my code.
}
On my localhost, I can download the same file over and over, and it doesn't matter what the file name on the FTP server is or where it's located.
On the second instance, which is identical to the localhost one in terms of code (i.e. pulled from the same git repo), I can download a file once, but the same file cannot be downloaded again: ftp_get() returns false. If I change the name of the file on the FTP server, I can download it once, but after that it won't work again, i.e. ftp_get() will return false.
I don't have access to the FTP server log. If it's available, I'm going to try to get it today from the host. But can anyone think of a reason this might be happening? ftp_get() just returns true or false without any explanation, so I'm pretty stuck with this.
I'm using PHP 5.4, and I have no idea what the spec is of the FTP (regular FTP) server.
As discussed, it sounded like ftp_get() was successfully obtaining the file and writing it locally. I wonder whether, due to a permissions problem, it fails when it tries to write the file locally a second time. Thus, the FTP channel itself is fine, and the problem is purely local.
I'm somewhat surprised at this though, as I would imagine PHP would have raised a warning. Is your error_reporting set to allow this whilst you are debugging?
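A rough sketch of that kind of check (assuming $connection, $downloadPath and $serverPath are set up exactly as in the question): turn warnings on while debugging and verify the local target is writable before calling ftp_get():
// Debugging sketch: surface warnings and rule out a local write problem
// before blaming the FTP channel itself.
error_reporting(E_ALL);
ini_set('display_errors', '1');

$dir = dirname($downloadPath);
if (!is_writable($dir) || (file_exists($downloadPath) && !is_writable($downloadPath))) {
    throw new Exception("Cannot write to $downloadPath locally");
}

if (!ftp_get($connection, $downloadPath, $serverPath, FTP_BINARY)) {
    throw new Exception("ftp_get() failed for $serverPath");
}
$successfulWrites[] = $downloadPath;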
I'm trying to develop an online management system for a very large FLAC music library for a radio station. It's got a beefy server and not many users, so I want to be able to offer a file download service where PHP transcodes the FLAC files into MP3/WAV depending on what the endpoint wants.
This works fine:
if ($filetype == "wav") {
    header("Content-Length: " . $bitrate * $audio->get_length());
    $command = "flac -c -d " . $audio->get_filename() . ".flac";
}

ob_end_flush();
$handle = popen($command, "r");
while ($read = fread($handle, 8192)) {
    echo $read;
}
pclose($handle);
and allows the server to start sending the file to the user before the transcoding (well, decoding in this case) completes, for maximum speed.
However, the problem I'm getting is that while this script is executing, I can't get Apache to handle any other requests on the entire domain. It'll still work fine on other VirtualHosts on the same machine, but nobody can load any pages on this website while one person happens to be downloading a file.
I've also tried implementing the same thing using proc_open with no difference, and have played with the Apache settings for number of workers and the like.
Is the only way to stop this behaviour to use something like exec and wait for the encoding process to finish before I start sending the file to the user? Because that seems sub-optimal! :(
UPDATE: it seems that other people can still access the website, but not me - i.e. it's somehow related to sessions. This confuses me even more!
Use session_write_close() at some point before you start streaming... You may also want to stream_set_blocking(false) on the read pipe.
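For example, a sketch of how that might slot into the streaming code above (same $command as in the question; a session is assumed to have been started earlier in the request):
// Release the session lock before the long-running stream, so other
// requests that share this session are no longer queued behind it.
session_write_close();

ob_end_flush();
$handle = popen($command, "r");
// stream_set_blocking($handle, false); // optional, as suggested above
while ($read = fread($handle, 8192)) {
    echo $read;
    flush();
}
pclose($handle);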
This has been bugging me for literally hours already.
I can't seem to figure out why PHP cURL won't download data to a file. (CURLOPT_FILE is set to a local file.) I am not getting any data. I periodically check the file size of the destination file and it is always zero. To give you a background, I am downloading a 90kb jpeg file (for testing purposes).
This is working on my local computer (XP) but not on the site I am working on (Windows Server 2003).
I did several tests which made the scenario even weirder.
I disabled CURLOPT_FILE to print the data returned by curl to standard output, and the binary data was printed.
Having experienced blocked websites before (since the server implements access control), I tried accessing the file from Internet Explorer and I was able to see it.
Having experienced blocked downloads before, I tried downloading the file from Internet Explorer and it was downloaded.
The file is created by fopen('', 'w') but the size remains 0. Despite this successful file creation, I thought maybe PHP had a problem with filesystem write privileges, so I set the exe to be runnable even by non-admin users. Still no download.
Has this ever occurred to anybody?
Any pointers will be appreciated. I am really stuck.
Thank you.
Here are the cURL options I set:
$connection = curl_init($src);
// If these are not set, curl_exec outputs data.
// If these are set, curl_exec does not send any data to the file
// pointed to by $file_handler. $file_handler is not null
// because it is opened as write (non-existing file is created)
curl_setopt($connection, CURLOPT_RETURNTRANSFER, true);
curl_setopt( $connection, CURLOPT_FILE, $file_handler );
PS: I'm doing these tests using the command line and not the browser.
You might not have permissions to write to the file.
I don't think you have to set CURLOPT_RETURNTRANSFER, and if you are running from the command line, be sure to run PHP with admin rights. I'm not sure how it works on Windows, but on Linux I always sudo every command-line script I run.
Also, if PHP safe mode is on, be sure to give the directory the same owner (UID) as the PHP file. Hmm, but since you can create the file (with 0 file size), it might have nothing to do with rights... Could you check the open_basedir PHP setting on your server? If it is set, cURL is not allowed to use the file protocol. Did you check the log files on your server? Maybe there is an error.
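For what it's worth, a minimal download-to-file sketch along those lines, with CURLOPT_RETURNTRANSFER left out and the cURL error surfaced (the URL and target path below are placeholders):
<?php
// Minimal sketch: stream the response straight into a file and report
// any cURL error instead of failing silently. URL and path are placeholders.
$src  = "http://example.com/test.jpg";
$dest = "C:\\temp\\test.jpg";

$file_handler = fopen($dest, 'w');
$connection   = curl_init($src);
curl_setopt($connection, CURLOPT_FILE, $file_handler); // no CURLOPT_RETURNTRANSFER
curl_setopt($connection, CURLOPT_FAILONERROR, true);   // treat HTTP errors as failures

if (curl_exec($connection) === false) {
    echo "cURL error: " . curl_error($connection);
}

curl_close($connection);
fclose($file_handler); // data may only hit disk once the handle is closed
clearstatcache();
echo "Downloaded " . filesize($dest) . " bytes";
?>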
You may need to figure out what user runs your PHP. If the user running the PHP script (the one that calls php) is not authorized to write to the directory of the file, or to the /path/to/file itself, you may need to adjust your file permissions.