I'm desperately trying to solve this one. I have a bunch of files stored outside of the webroot and I need to serve them to a user after a few auth checks. These files have been uploaded using a Flex application or have just been manually uploaded through FTP. I have a serving script that looks something like:
<?php
$filePath = '/for/demonstration/only.jpg';
...
$type = exif_imagetype($filePath);
$size = filesize($filePath);
if ($type && $size > 0) {
    switch ($type) {
        case IMAGETYPE_PNG:
            header("Content-Type: image/png");
            break;
        case IMAGETYPE_JPEG:
            header("Content-Type: image/jpeg");
            break;
        default:
            header("Content-Type: text/plain");
            break;
    }
    header("Content-Length: {$size}");
    readfile($filePath);
    exit;
} else {
    echo 'error';
}
Pretty simple. Somewhere in the upload process, however, the file gains an extra 100-130B, presumably from the encoding step, and now seems to be corrupted: I get the "extraneous bytes" error. The upload script is pretty simple as well. Flex uses FileReference for the user to select the file, then encodes the data and sends it to the server script:
<?php
function fileupload($data)
{
    $daily_folder = 'today/';
    // $fileName and $max_file_len are assumed to be set elsewhere in the real script
    $fileName_clipped = substr($fileName, 0, $max_file_len);
    $fileName_clipped = preg_replace('/\./', '_', $fileName_clipped);
    $filePath = '/path/to/storage/' . $daily_folder;
    if (!is_dir($filePath)) {
        mkdir($filePath);
    }
    if (strlen($data->filedata) > 0) {
        if (!file_put_contents($filePath . $fileName_clipped, base64_decode($data->filedata))) {
            return false;
        }
    } else {
        return false;
    }
}
Running the process:
file A: 31,740B in, 31,848B out, 108B extra
file B: 35,273B in, 35,403B out, 130B extra
I imagine this could be on the Flash side, but honestly it's dirt simple. I just don't see where the extra data is coming from, or why it's corrupting the file. Does anyone know why this is happening? Or better yet, how can I clean these files up now?
Here's the dealio. When I backed up all the images from one server and moved them to another, I FTPed all the files onto my Windows laptop. That process, whether because of Windows or perhaps FileZilla, corrupted all of the files, adding a bunch of junk onto them. My best guess is an ASCII-mode transfer, which rewrites LF bytes as CR+LF on the way to Windows and would account for the 100-odd extra bytes per file. I'd love input as to what you think caused the problem, but regardless, I located the problem and fixed it.
The solution was to zip the parent directory on the server and download the zip. I didn't have to modify any code. Just a simple procedure revision. How lame.
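If re-downloading isn't an option and the damage really is ASCII-mode line-ending conversion, reversing it is just a matter of collapsing CR+LF pairs back to LF. A minimal cleanup sketch under that assumption (the paths are placeholders; back up the originals and verify one repaired file by hand before batch-running anything):
<?php
// ASCII-mode FTP rewrites every LF (0x0A) as CR+LF (0x0D 0x0A).
// str_replace("\r\n", "\n") reverses that mapping; bytes that were
// CR+LF originally became CR+CR+LF and come back out as CR+LF.
$corrupted = '/path/to/corrupted.jpg'; // placeholder input path
$data = file_get_contents($corrupted);
$fixed = str_replace("\r\n", "\n", $data);
file_put_contents($corrupted . '.fixed', $fixed); // placeholder output path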
I am working on a project where the client side should start a download from a remote URL.
I currently have it working with cURL, but the problem is that it downloads to the server instead of into the client's browser (asking where to save it).
The URL looks like this:
http://r13---sn-5hn7snee.googlevideo.com/videoplayback?expire=138181&mv=u&ipbits=0&clen=7511717&fexp=929447,913430,930802,916612,902546,937417,913434,936916,934022,936921,936923&ms=au&mt=1396164&itag=247&key=yt5&source=youtube&ip=2a00:7143:100:1:225:90ff:fe52:b3ad&sparams=clen,dur,gir,id,ip,ipbits,itag,lmt,source,upn,expire&signature=DBDCE9EFFC094C0FC653610C8FAAF367EA942CC4C6EF3971A186C&upn=0JQJ6yqyYPo&sver=3&gir=yes&lmt=13960601932&dur=130.133&id=17f0fd8b&size=1280x720,type=video/mp4
When downloading straight from this URL, the filename will be 'videoplayback' with no file extension, so I need to set a custom file name and .mp4 as the file type (which the code below handles), but it writes to the server instead of triggering a client-side download.
function download($file_source, $file_target) {
    $rh = fopen($file_source, 'rb');
    $wh = fopen($file_target, 'w+b');
    if (!$rh || !$wh) {
        return false;
    }
    while (!feof($rh)) {
        if (fwrite($wh, fread($rh, 4096)) === FALSE) {
            return false;
        }
        echo ' ';
        flush();
    }
    fclose($rh);
    fclose($wh);
    return true;
}
$result = download(
    'http://r13---sn-5hn7snee.googlevideo.com/videoplayback?expire=138181&mv=u&ipbits=0&clen=7511717&fexp=929447,913430,930802,916612,902546,937417,913434,936916,934022,936921,936923&ms=au&mt=1396164&itag=247&key=yt5&source=youtube&ip=2a00:7143:100:1:225:90ff:fe52:b3ad&sparams=clen,dur,gir,id,ip,ipbits,itag,lmt,source,upn,expire&signature=DBDCE9EFFC094C0FC653610C8FAAF367EA942CC4C6EF3971A186C&upn=0JQJ6yqyYPo&sver=3&gir=yes&lmt=13960601932&dur=130.133&id=17f0fd8b&size=1280x720,type=video/mp4',
    'video.mp4' // placeholder local target; download() requires this second argument
);
if (!$result) {
    throw new Exception('Download error...');
}
So my question is: how can I change this so that it renames the file, gives it an .mp4 extension, and downloads it on the client side?
Thanks
First output the correct headers so the client knows how to handle the file, then echo the file contents. So try it this way:
<?php
// ... fetch the file using CURL and store it (in memory or on file system) ...
// Set the content type
header('Content-type: application/pdf');
// It will be called downloaded.pdf
header('Content-Disposition: attachment; filename="downloaded.pdf"');
// ... echo/output the file contents to the browser...
?>
Check out "Example #1 Download dialog" in the PHP documentation.
Also, you mention you are using cURL, but I don't see it anywhere in your code.
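Adapted to this question, a minimal sketch might look like the following (the filename myvideo.mp4 is a placeholder, and the streaming loop assumes allow_url_fopen is enabled; a cURL handle with CURLOPT_WRITEFUNCTION would work just as well):
<?php
// send headers first so the browser shows a save dialog
// with the desired name and type
header('Content-Type: video/mp4');
header('Content-Disposition: attachment; filename="myvideo.mp4"');

// stream the remote file straight through to the client
// instead of writing it to the server's file system
$rh = fopen($file_source, 'rb'); // $file_source = the googlevideo URL
while (!feof($rh)) {
    echo fread($rh, 8192);
    flush();
}
fclose($rh);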
I'm trying to use WordPress on the AppFog PaaS system. Unfortunately, AppFog doesn't have persistent storage so all content outside of the database needs to be stored on some external system (like S3). I'm successfully using a plugin which pushes all my WordPress media out to S3, but am having problems loading some of the images.
To investigate, I deployed the following script:
<?php
// get the image name from the query string
// and make sure it's not trying to probe your file system
if (isset($_GET['pic'])) {
    $pic = $_GET['pic'];
    // get the filename extension
    $ext = substr($pic, -3);
    // set the MIME type
    switch ($ext) {
        case 'jpg':
            $mime = 'image/jpeg';
            break;
        case 'gif':
            $mime = 'image/gif';
            break;
        case 'png':
            $mime = 'image/png';
            break;
        default:
            $mime = false;
    }
    // if a valid MIME type exists, display the image
    // by sending appropriate headers and streaming the file
    if ($mime) {
        header('Content-type: '.$mime);
        header('Content-length: '.filesize($pic));
        $file = fopen($pic, 'rb');
        if ($file) {
            fpassthru($file);
            exit;
        }
    }
}
?>
This allows me to directly test my ability to read and write an image in PHP. The proxy script works perfectly for images under around 10KB -- i.e. when I open the script in a browser pointed at some "small" image file in my S3 bucket, I'm able to see it.
However, when I attempt to load a "large" file (anything over 10KB), I get an error. In Firefox, that's:
The image “http://myssite.com/iproxy.php?pic=http://aws.amazon.com%2Fmybucket%2Fwp-content%2Fuploads%2F2013%2F01%2Fmylargeimage.png” cannot be displayed because it contains errors.
I've been wrestling with this for hours and can't seem to figure anything out. I've tried changing the output_buffering to a larger value but that hasn't helped.
Any tips would be appreciated!
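For what it's worth, one hedged guess: filesize() on a remote http:// URL is unreliable (PHP's HTTP stream wrapper can't always stat the file and may return false), and a wrong or empty Content-Length header is exactly the kind of thing that makes a browser report an image as containing errors. A sketch of the serving block with that assumption in mind:
if ($mime) {
    header('Content-type: ' . $mime);

    // filesize() on a remote URL can return false; only promise
    // a length we actually know
    $size = @filesize($pic);
    if ($size !== false && $size > 0) {
        header('Content-length: ' . $size);
    }

    // flush any PHP output buffers before streaming raw image bytes
    while (ob_get_level() > 0) {
        ob_end_clean();
    }

    $file = fopen($pic, 'rb');
    if ($file) {
        fpassthru($file);
        exit;
    }
}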
After searching Google and SO, I found this little bit of code for creating thumbnails of PDF documents using ImageMagick.
The trouble for me is implementing it in my WordPress theme. I think I'm getting stuck on the cache path the script needs for temporary files.
I'm using it as described in the article:
<img src="http://localhost/multi/wp-content/themes/WPalchemy-theme/thumbPdf.php?pdf=http://localhost/multi/wp-content/uploads/2012/03/sample.pdf&size=200" />
which must be right (maybe... but I assume I'm correct to use the full URL to the actual file), because when I click on that URL I am taken to a page that shows the following error:
Unable to read the file: tmp/http://localhost/multi/wp-content/uploads/2012/03/sample.pdf.png
Now, tmp is defined in the thumbPdf.php script, but I am confused as to what its value should be. Is it a URL or a path? Like timthumb.php, can I make it relative to the thumbPdf.php script? (I tried ./cache, which is the setting in timthumb, and made sure to have a /cache folder in my theme root, to no avail.) Also, FYI, I put a /tmp folder in my root and still get the same error.
So how do I configure tmp to make this work?
http://stormwarestudios.com/articles/leverage-php-imagemagick-create-pdf-thumbnails/
function thumbPdf($pdf, $width)
{
    try
    {
        $tmp = 'tmp';
        $format = "png";
        $source = $pdf.'[0]';
        $dest = "$tmp/$pdf.$format";
        if (!file_exists($dest))
        {
            $exec = "convert -scale $width $source $dest";
            exec($exec);
        }
        $im = new Imagick($dest);
        header("Content-Type:".$im->getFormat());
        echo $im;
    }
    catch (Exception $e)
    {
        echo $e->getMessage();
    }
}

$file = $_GET['pdf'];
$size = $_GET['size'];
if ($file && $size)
{
    thumbPdf($file, $size);
}
I have seen this answer:
How do I convert a PDF document to a preview image in PHP?
and am about to try it next.
The error tells you everything you need:
Unable to read the file: tmp/http://localhost/multi/wp-content/uploads/2012/03/sample.pdf.png
The script is currently trying to read the file from the server's tmp/ folder, with the full source URL embedded in the destination filename.
$tmp = 'tmp';
$format = "png";
$source = $pdf.'[0]';
//$dest = "$tmp/$pdf.$format";
$dest = "$pdf.$format";
Remember, security-wise this doesn't look so good: someone could exploit an ImageMagick bug to do very nasty things by feeding your script a malformed external source PDF. You should at least check that the PDF comes from an allowed source, e.g. that the request originates from the same host.
The best way to work with ImageMagick is to always save the generated image and only generate a new one if it doesn't already exist. Some ImageMagick operations are quite heavy on large files, so you don't want to burden your server.
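As a rough sketch of both points (the allowlist prefix and the tmp/ cache directory are placeholder values to adapt):
function thumbPdfSafe($pdf, $width)
{
    // only thumbnail PDFs from sources we trust (placeholder prefix)
    $allowed_prefix = 'http://localhost/multi/wp-content/uploads/';
    if (strpos($pdf, $allowed_prefix) !== 0) {
        die('PDF source not allowed');
    }

    // cache in a local, writable directory keyed on the source URL,
    // so each PDF is converted at most once
    $dest = 'tmp/' . md5($pdf . $width) . '.png';
    if (!file_exists($dest)) {
        $source = escapeshellarg($pdf . '[0]');
        exec("convert -scale " . (int)$width . " $source " . escapeshellarg($dest));
    }

    header('Content-Type: image/png');
    readfile($dest);
}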
I'm stumped by some PHP...
I have a Flash application that sends an image (using as3corelib) to a PHP script, which previews it in the browser, and that works! But I would actually like it to permanently save the image to a server folder (uploads, etc.) instead of temporarily serving it. I can't find the right variable in the PHP that actually carries the image, so that the server can save it.
<?php
switch ($_POST["format"]) {
    case 'jpg':
        header('Content-Type: image/jpeg');
        break;
    case 'png':
        header('Content-Type: image/png');
        break;
}
if ($_POST['action'] == 'prompt') {
    header("Content-Disposition: attachment; filename=" . $_POST['fileName']);
}
echo base64_decode($_POST["image"]);
?>
Here's an example of it: http://shmoggo.com/snapshot
JPEG, Open in Browser (but I would like it to SAVE to the server)
Any PHP guru help would be terrific, thanks a lot!
Aaron
If you have the image data, you can simply do:
$newpath = "/folders/image.jpg";
$data = base64_decode($_POST['image']); // decode the base64 payload the Flash app posts
file_put_contents($newpath, $data);
Rather than displaying it, save $_POST['image'] to the server; see the Filesystem functions.
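Putting the two answers together, a minimal sketch (the uploads/ directory is a placeholder; note the basename() call so a posted fileName can't escape the folder):
<?php
// never trust a client-supplied path: keep only the base filename
$name = basename($_POST['fileName']);

// decode the base64 image data and write it to a permanent folder
// ('uploads/' is a placeholder; make sure it exists and is writable)
$data = base64_decode($_POST['image']);
file_put_contents('uploads/' . $name, $data);

// optionally still echo it back so the browser preview keeps working
header('Content-Type: ' . ($_POST['format'] === 'png' ? 'image/png' : 'image/jpeg'));
echo $data;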
I'm finding it difficult to phrase this question correctly; let me try to explain our problem...
We have an intranet running on an Ubuntu box with Apache2/PHP 5.2.4. We have a bit of PHP code that reads a file from a directory that is not publicly accessible and outputs it to the browser (code below):
$file_path = '/home/path/to/filename.gif';
if (file_exists($file_path)) {
    $output = FALSE;
    //File Information
    $path_parts = pathinfo($file_path);
    $file_size = filesize($file_path);
    $file_ext = (isset($path_parts['extension'])) ? strtolower($path_parts['extension']) : null;
    $file_name = $path_parts['basename'];
    //Sets up the headers
    if ($file_size > 0) {
        header('Content-Length: ' . $file_size);
    }
    header('Content-Disposition: attachment; filename="' . $file_name . '"');
    header('Content-Type: application/octet-stream');
    //Reads the file (binary-safe mode)
    if ($file_size > 0) {
        $handle = fopen($file_path, "rb");
        $output = fread($handle, $file_size);
        fclose($handle);
    }
    //Outputs the file
    echo $output;
}
Inside our network, when browsing to the page that uses this code, the file downloads perfectly and quickly...
However, when accessing the page via our Cisco ASA/proxy/VPN (not sure what to call it), this code locks up the browser, though it does eventually download the file...
After a bit of experimenting, I found that if I take out the headers and just echo the contents of the file to the browser, it prints no problem. But as soon as I add the header lines back into the code, the hanging returns, and only when the page is accessed via this box.
Has anybody come across this problem before, or have any idea what we can try to move forward?
Thanks for any advice...
Have you tried eliminating the Content-Length header entirely? The proxy may be taking it as a firm promise, and if the data you're sending ends up being a different size, the proxy may wait for those last few "missing" bytes to show up.
Just as an aside, you should use readfile() instead of the fopen()/fread()/echo construct you have now.
As it stands, you're slurping the contents of the entire file into memory and then echoing it out. For large files and multiple requests, you'll kill the server with memory starvation. readfile() automatically streams the file in smaller chunks, so memory usage stays minimal.
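A sketch of the serving block rewritten around readfile(), with the same headers as the original and Content-Length intentionally left out per the suggestion above:
header('Content-Disposition: attachment; filename="' . $file_name . '"');
header('Content-Type: application/octet-stream');

// readfile() streams the file to the output buffer in chunks,
// so the whole file never has to sit in memory at once
readfile($file_path);
exit;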
Your proxy obviously has problems with Content-Type: application/octet-stream. Try setting it to the real MIME type of each file. You can use the Fileinfo module to find out the MIME type of a given file, like this:
//You may need to specify the location of your system's magic file
//See http://php.net/finfo_open for more info
$finfo = new finfo(FILEINFO_MIME);
$mimetype = $finfo->file($file_path);
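And then, as a sketch, feed the result into the header instead of the hard-coded octet-stream:
// $mimetype may look like "image/gif; charset=binary" with FILEINFO_MIME;
// use FILEINFO_MIME_TYPE if you want just "image/gif"
header('Content-Type: ' . $mimetype);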