I wrote a PHP function to copy an image from the internet to a local folder. Sometimes it works well, but sometimes it just generates an invalid file with a size of 1257 bytes.
function copyImageToLocal($url, $id)
{
    $ext = strrchr($url, ".");
    $filename = 'images/' . $id . $ext;

    // Buffer the remote file's contents via readfile()
    ob_start();
    readfile($url);
    $img = ob_get_contents();
    ob_end_clean();

    // Write the buffered image to disk
    $fp = @fopen($filename, "a");
    fwrite($fp, $img);
    fclose($fp);
}
Note: the $url passed in is valid. Sometimes this function fails on the first try but succeeds on the second or third attempt. It's really strange...
Does this require some special PHP settings?
Please help me!
I have found the real reason: the test image URL blocks programmatic access, even though it can be opened in a browser.
I tried some other image URLs and the function works well with them.
So it seems I need to find a way to handle this kind of case.
Thanks guys!
Why don't you just open the file and write it to the disk like so:
file_put_contents($filename, fopen($url, 'r'));
This will even do the buffering for you, so you shouldn't run into memory problems (your current approach stores the whole image in memory before writing it to a file).
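If the host turns out to block non-browser clients, as in the update above, one thing worth trying is a stream context that sends a browser-like User-Agent. This is only a minimal sketch: the function name and User-Agent string are made up, and it assumes the host filters on that header alone.

// Sketch: variant of copyImageToLocal() that sends a User-Agent header,
// in case the remote host rejects requests without one (an assumption).
function copyImageToLocalWithUserAgent($url, $id)
{
    $ext = strrchr($url, ".");
    $filename = 'images/' . $id . $ext;

    $context = stream_context_create(array(
        'http' => array(
            'header' => "User-Agent: Mozilla/5.0 (compatible; ImageFetcher/1.0)\r\n",
        ),
    ));

    // Open the remote image as a stream and copy it straight to disk.
    $source = @fopen($url, 'r', false, $context);
    if ($source === false) {
        return false; // could not open the remote image
    }

    $written = file_put_contents($filename, $source);
    fclose($source);

    return $written !== false;
}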
Related
So I ran into a little problem with a project of mine. We have a bulky server with lots of space, as well as a lightweight static storage server that can only be used to store things. We need to make sure that only authenticated users can access the resources on the static server, so I thought about making a pseudo-proxy out of readfile(), since we can use allow_url_fopen.
So I tried the following code as a test:
<?php
$type = "video/webm";
$loc = "http://a.pomf.se/fzggfj.webm";
header('Content-Type: '.$type);
header('Content-Length: '.filesize($loc));
readfile($loc);
exit;
This always fails; the browser treats the response as corrupted. Interestingly, when you do this:
<?php
$type = "video/webm";
$loc = "../test.webm";
header('Content-Type: '.$type);
header('Content-Length: '.filesize($loc));
readfile($loc);
exit;
It does work, even though it is the exact same file. Does anyone know why readfile() will not handle the remote URL correctly, and can someone explain this to me?
EDIT:
I got the error message out of it; it was stuck inside the file that was served:
Warning: filesize(): stat failed for http://a.pomf.se/fzggfj.webm in C:\uniform\UniServerZ\www\director.php on line 5
Is filesize() my problem here?
OK, I fixed it. deceze was correct, and filesize() was the issue. Let the record show that filesize() doesn't work on remote resources, I guess.
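For reference, a sketch of one workaround (untested against that exact host): take the size from the remote response headers via get_headers() instead of calling filesize() on the URL, or simply leave Content-Length out.

<?php
// Sketch: proxy a remote file without calling filesize() on a URL.
$type = "video/webm";
$loc = "http://a.pomf.se/fzggfj.webm";

$headers = get_headers($loc, 1); // associative array of response headers
header('Content-Type: ' . $type);
if ($headers !== false && isset($headers['Content-Length']) && !is_array($headers['Content-Length'])) {
    // Content-Length may be an array if the server redirected.
    header('Content-Length: ' . $headers['Content-Length']);
}
readfile($loc);
exit;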
You need to enable allow_url_fopen by adding allow_url_fopen = 1 to your php.ini.
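A quick way to check that setting from inside a script (just a sketch):

<?php
// readfile()/fopen() on http:// URLs only work when allow_url_fopen is on.
if (!ini_get('allow_url_fopen')) {
    die('allow_url_fopen is disabled; set allow_url_fopen = 1 in php.ini');
}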
Why don't you download the video to a temporary directory and redirect the user there? (Of course, you can clear out the outdated tmp directory later with a cron script.)
try this:
<?php
$loc = "http://a.pomf.se/fzggfj.webm";
$pathToVideos = dirname(__FILE__).'/tmp/';
$ext = explode('.', $loc);
$ext = end($ext);
$hash = md5($loc);
$filename = $hash.'.'.$ext;
$tmpFile = $pathToVideos.$filename;
if(!is_file($tmpFile)) {
    exec('wget -O '.escapeshellarg($tmpFile).' '.escapeshellarg($loc));
}
header('Location: /tmp/'.$filename);
exit(0);
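If shelling out to wget is not an option, the same caching idea can be done with PHP's copy(), which accepts URLs when allow_url_fopen is enabled. A sketch along the same lines:

<?php
// Sketch: cache the remote video under /tmp/ using copy() instead of wget.
$loc = "http://a.pomf.se/fzggfj.webm";
$pathToVideos = dirname(__FILE__) . '/tmp/';

$ext = pathinfo(parse_url($loc, PHP_URL_PATH), PATHINFO_EXTENSION);
$filename = md5($loc) . '.' . $ext;
$tmpFile = $pathToVideos . $filename;

if (!is_file($tmpFile)) {
    copy($loc, $tmpFile); // requires allow_url_fopen
}

header('Location: /tmp/' . $filename);
exit(0);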
I want to store some data retrieved using an API on my server. Specifically, these are .mp3 files of (free) learning tracks. I'm running into a problem though. The mp3 link returned from the request isn't to a straight .mp3 file, but rather makes an ADDITIONAL API call which normally would prompt you to download the mp3 file.
file_put_contents doesn't seem to like that. The mp3 file is empty.
Here's the code:
$id = $_POST['cid'];
$title = $_POST['title'];
if (!file_exists("tags/".$id."_".$title)) {
    mkdir("tags/".$id."_".$title);
} else {
    echo "Dir already exists";
}
file_put_contents("tags/{$id}_{$title}/all.mp3", fopen($_POST['all'], 'r'));
And here is an example of the second API I mentioned earlier:
http://www.barbershoptags.com/dbaction.php?action=DownloadFile&dbase=tags&id=31&fldname=AllParts
Is there some way to bypass this intermediate step? If there's no way to access the direct URL of the mp3, is there a way to redirect the file download prompt to my server?
Thank you in advance for your help!
EDIT
Here is the current snippet. I should be echoing something, correct?
$handle = fopen("http://www.barbershoptags.com/dbaction.php?action=DownloadFile&dbase=tags&id=31&fldname=AllParts", 'rb');
$contents = stream_get_contents($handle);
echo $contents;
Because this echoes nothing.
SOLUTION
OK, I guess file_get_contents() is supposed to handle redirects just fine, but that wasn't happening here. So I found this function: https://stackoverflow.com/a/4102293/2723783 to return the final redirect target of the API. I plugged that URL into file_get_contents() and voilà!
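For anyone who prefers not to resolve the redirect manually, a cURL-based sketch that follows redirects should achieve the same thing (the target path here is just an example):

<?php
// Sketch: download through the redirecting endpoint with cURL.
$url = 'http://www.barbershoptags.com/dbaction.php?action=DownloadFile&dbase=tags&id=31&fldname=AllParts';
$target = 'tags/example/all.mp3'; // example path only

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow Location: redirects
$data = curl_exec($ch);
curl_close($ch);

if ($data !== false) {
    file_put_contents($target, $data);
}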
You seem to be just opening the file handle and not reading the contents with fread() or another similar function:
http://www.php.net/manual/en/function.fread.php
$handle = fopen($_POST['all'], 'rb');
file_put_contents("tags/{$id}_{$title}/all.mp3", stream_get_contents($handle));
I'm writing a function in PHP. On the client side I have a canvas image, which I convert with toDataURL() and send along with a file name to save the image on the server. Here's the code:
<?php
$imageData=$GLOBALS['HTTP_RAW_POST_DATA'];
$data = json_decode($imageData, true);
$file = $data["file"];
$image = $data["data"];
$filteredData=substr($image, strpos($image, ",")+1);
$unencodedData=base64_decode($filteredData);
$fp = fopen( 'image/' . $file , 'wb' );
fwrite( $fp, $unencodedData);
fclose( $fp );
?>
The thing is that this code works. For two out of the three pages I used it on, it works fine. The problem is that when I copied and pasted it a third time to implement it again, for some reason the file is created on the server but no data gets written into it. I don't think it's a problem on the client side, because I put a debug alert message in the JavaScript and a debug echo in the PHP, and both print out the data fine. I made this short debug file:
<?php
$fp = fopen('data.txt', 'wb');
if (is_writable('data.txt')) {
    echo "file is writable<br>";
}
if (fwrite($fp, 'test') == FALSE) {
    echo "failed to write data<br>";
}
fclose($fp);
?>
And the output is
file is writable
failed to write data
I've tried using chmod to set everything (the folder and the text file) to 0777 before writing to it, and I still get the same result: the file is created but no data is written into it. Is there anything I'm missing, or any other approach that might help? I haven't found anything on Google and am still baffled as to why the same code worked exactly as expected twice before suddenly stopping for no apparent reason.
Thanks in advance.
I know this is an old post, but I had a very similar problem and found a solution (for me at least)! I ran out of disk space on my server, so it could create a 0 byte file, but wouldn't write to it. After I cleared out some space (deleted a 13gb error.log file) everything started working again as expected.
If fopen() works but fwrite() mysteriously doesn't, check your disk space. df -h is the command to check disk space on a Linux server.
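If you want the script itself to catch that situation, disk_free_space() can be used to check before writing. A small sketch (the 1 MB threshold is arbitrary):

<?php
// Sketch: warn when the partition holding this script is nearly full.
$free = disk_free_space(dirname(__FILE__));
if ($free !== false && $free < 1024 * 1024) {
    error_log('Low disk space: only ' . $free . ' bytes free');
}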
Instead of $fp = fopen('data.txt', 'wb'); use $fp = fopen('data.txt', 'w'); and try again.
In other words, change "wb" to "w".
When you write $fp = fopen('data.txt', 'w'); on your website.com domain with its root at /var/www/website/, and the PHP file is located at /var/www/website/php/server/file/admin.php or somewhere similar, it may actually create the file at /var/www/website/data.txt, because relative paths resolve against the current working directory, not the script's directory.
Try giving an absolute path, or a path relative to your domain root, when creating files, like:
$fp = fopen('php/server/file/data.txt', 'w');
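Another option (a small sketch) is to anchor the path to the script's own directory, so it doesn't depend on the current working directory:

<?php
// Sketch: build the path from the script's directory instead of relying
// on the current working directory.
$fp = fopen(dirname(__FILE__) . '/data.txt', 'w');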
You can use the find command to see whether the file was created somewhere else in the directory tree, for example on Ubuntu:
find /var/www/website/ -name 'data.txt'
I had this issue; this might help you if you are running into something similar.
So, I'm writing a chunked file transfer script that is intended to copy files, small and large, to a remote server. It almost works fantastically (and did with a 26-byte file I tested, haha), but when I start on larger files, I notice it isn't quite working. For example, I uploaded a 96,489,231-byte file, but the final file was 95,504,152 bytes. I tested it with a 928,670,754-byte file, and the copied file only had 927,902,792 bytes.
Has anyone else ever experienced this? I'm guessing feof() may be doing something wonky, but I have no idea how to replace it, or test that. I commented the code, for your convenience. :)
<?php
// FTP credentials
$server = CENSORED;
$username = CENSORED;
$password = CENSORED;

// Destination file (where the copied file should go)
$destination = "ftp://$username:$password@$server/ftp/final.mp4";

// The file on my server that we're copying (in chunks) to $destination.
$read = 'grr.mp4';

// If the file we're trying to copy exists...
if (file_exists($read))
{
    // Set a chunk size
    $chunk_size = 4194304;

    // For reading through the file we want to copy to the FTP server.
    $read_handle = fopen($read, 'rb');

    // For appending to the destination file.
    $destination_handle = fopen($destination, 'ab');

    echo '<span style="font-size:20px;">';
    echo 'Uploading.....';

    // Loop through $read until we reach the end of the file.
    while (!feof($read_handle))
    {
        // So Rackspace doesn't think nothing's happening.
        echo PHP_EOL;
        flush();

        // Read a chunk of the file we're copying.
        $chunk = fread($read_handle, $chunk_size);

        // Write the chunk to the destination file.
        fwrite($destination_handle, $chunk);
        sleep(1);
    }

    echo 'Done!';
    echo '</span>';
}
fclose($read_handle);
fclose($destination_handle);
?>
EDIT
I (may have) confirmed that the script is dying at the end somehow, and not corrupting the files. I created a simple file with each line corresponding to the line number, up to 10000, then ran my script. It stopped at line 6253. However, the script is still returning "Done!" at the end, so I can't imagine it's a timeout issue. Strange!
EDIT 2
I have confirmed that the problem exists somewhere in fwrite(). By echoing $chunk inside the loop, the complete file is returned without fail. However, the written file still does not match.
EDIT 3
It appears to work if I add sleep(1) immediately after the fwrite(). However, that makes the script take a million years to run. Is it possible that PHP's append has some inherent flaw?
EDIT 4
Alright, I've further isolated the problem: it is somehow an FTP problem. When I run this file copy locally, it works fine. However, when I copy over FTP (the ftp:// destination URL), bytes go missing. This happens despite the binary flags in the two calls to fopen(). What could possibly be causing this?
EDIT 5
I found a fix. The modified code is above--I'll post an answer on my own as soon as I'm able.
I found a fix, though I'm not sure exactly why it works: simply sleeping after writing each chunk fixes the problem. I upped the chunk size quite a bit to speed things up. Though this is arguably a bad solution, it should work for my purposes. Thanks anyway, guys!
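A possible alternative to sleeping after each chunk (a sketch, not tested against that FTP server): fwrite() can write fewer bytes than requested on network streams, so checking its return value and re-writing the remainder may avoid the lost bytes without the delay. The helper name here is made up.

<?php
// Sketch: keep writing until the whole chunk has gone out, instead of
// assuming a single fwrite() call wrote everything.
function write_all($handle, $data)
{
    $length = strlen($data);
    $written = 0;
    while ($written < $length) {
        $result = fwrite($handle, substr($data, $written));
        if ($result === false || $result === 0) {
            return false; // hard error or no progress; give up
        }
        $written += $result;
    }
    return true;
}

Inside the loop, fwrite($destination_handle, $chunk); would then become write_all($destination_handle, $chunk);.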
I am working on a function for a PostgreSQL database so that when a client requests a database dump, the dump is offered as a download. This snapshot could later be used to restore the database. However, I can't seem to figure out how to do it. When the user presses the button, an AJAX call is made to the server, upon which the server executes the following code:
if($_POST['command'] == 'dump'){
    $dump = $table->Dump();
    header("Content-type: application/octet-stream");
    header('Content-Disposition: attachment; filename=/"'.$dump.'/"');
}
Where $table->Dump() looks like this:
public function Dump(){
    $filename = dirname(__FILE__)."/db_Dump.out";
    exec("pg_dump ".$this->name." > $filename");
    return $filename;
}
The dump isn't made though. Any tips on this?
This approach, however, doesn't work. I thought that setting the headers would be enough to trigger a download, but apparently I was wrong. So what would be the correct way of creating a download?
Edit 1, #stevevls:
if($_POST['command'] == 'dump'){
    $dump = $table->Dump();
    $fh = fopen($dump, 'r') or die("Can't open file");
    header("Content-type: application/octet-stream");
    header('Content-Disposition: attachment; filename=/"'.$dump.'/"');
    $dumpData = fread($fh, filesize($fh));
    fclose($fh);
    echo $dumpData;
}
I still don't get anything as a download though.
Edit 2, #myself
I have been able to get a return value; it turned out that the check for whether the given command was 'dump' was never being reached. I fixed that, and now I get an error from the pg_dump command:
sh: cannot create ../database/db_Dump.sql: Permission denied
I bet this is due to PHP not being allowed to run pg_dump, but how can I get the system to allow it to run it?
Edit 3, #myself
After resolving the issue with pg_dump (I added www-data, Apache's user on my system, to the sudoers list, which resolved it; setting the correct permissions on the directory being written to helps as well), I now get db_Dump.sql rendered as plain text instead of a save-as dialog. Any ideas on that?
First of all, check whether the dump file was actually created on disk.
Second, check that your PHP script has not hit its time limit, because creating a dump can take a long time.
Third, do you really want to read the whole dump into memory? You can easily hit the memory limit, so do it part by part. The fread manual on php.net has an example:
$handle = fopen("http://www.example.com/", "rb");
$contents = '';
while (!feof($handle)) {
    $contents .= fread($handle, 8192);
}
fclose($handle);
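To make that first check automatic, here is a sketch that adapts the Dump() method from the question: escape the arguments and look at pg_dump's exit code, so a failure (like the permission error above) isn't silently swallowed.

// Sketch: Dump() with escaped arguments and an exit-code check.
public function Dump()
{
    $filename = dirname(__FILE__) . "/db_Dump.out";
    exec("pg_dump " . escapeshellarg($this->name)
        . " > " . escapeshellarg($filename), $output, $exitCode);
    if ($exitCode !== 0) {
        return false; // pg_dump failed (permissions, missing binary, ...)
    }
    return $filename;
}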
It turns out it was all down to how I requested the download. It seems that it is impossible to trigger a file download when you request it via Ajax, because the returned file just ends up in the success handler of the call. After changing this to a direct link to the file, I was able to get the download.
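For completeness, a sketch of what the directly-linked download script might look like (assuming $table->Dump() returns the path of the dump file, as in the question):

<?php
// Sketch: direct-download endpoint, linked to with a plain <a href>
// rather than called via Ajax.
$dump = $table->Dump(); // assumed to return the dump file's path

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($dump) . '"');
header('Content-Length: ' . filesize($dump));

readfile($dump); // stream the file instead of loading it all into memory
exit;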