PHP/Perl filesize function not working on new server - php

I have a problem with a function that doesn't work as expected since I moved my site from shared hosting to a VPS (both run the same Linux OS, PHP 5.2.9 and Perl 5.8.8).
While my script stores a remote file into a local directory, I poll a simple PHP script at regular intervals (every 5 seconds) using XMLHttpRequest; this PHP script executes a Perl script that returns the current file size (the bytes downloaded so far).
Here is the PHP code:
<?php
if (isset($_GET['file'])) {
    clearstatcache();
    // quote the user-supplied name before passing it to the shell
    $file = escapeshellarg($_GET['file']);
    exec("/usr/bin/perl /home/xxxxxx/public_html/cgi-bin/filesize.pl $file", $output);
    //print_r($output);
    if (!empty($output) && $output[0] != "") {
        $currentSize = $output[0];
        file_put_contents('progress.txt', $currentSize);
    } else {
        ...
        ...
    }
}
?>
Here is the Perl code:
#!/usr/bin/perl
$filename = $ARGV[0];
$filepath = '/home/xxxxxx/public_html/tmp_dir/'.$filename.'.flv';
$filesize = -s $filepath;
print $filesize;
When I ran these scripts on the shared server I had no problem and could see the download progress, but now the file size is only printed once the remote file has been fully downloaded, so I can't see the progress.
I think I need to change something in the PHP settings, but I'm not sure what needs to be changed.
EDIT: OK, my mistake, the filesize() function works fine. Thank you all.

If you need the file size, you could also just call the filesize() function from PHP and avoid having to use Perl altogether.

The problem is probably caused by a different file location. Are you positive that the file '/home/xxxxxx/public_html/tmp_dir/'.$filename.'.flv' exists? You could test it with:
if (-e '/home/xxxxxx/public_html/tmp_dir/'.$filename.'.flv')
Remember that you could use PHP filesize() instead:
<?php
if (isset($_GET['file'])) {
    clearstatcache();
    $file = $_GET['file'];
    if (file_exists("/home/xxxxxx/public_html/tmp_dir/$file.flv")) {
        $currentSize = filesize("/home/xxxxxx/public_html/tmp_dir/$file.flv");
        file_put_contents('progress.txt', $currentSize);
    } else {
        ...
        ...
    }
}
?>


"No such file or directory" on localhost copy

EDIT: I'm pretty sure the issue has to do with the firewall, which I can't access. Marking Canis' answer as correct; I will figure something else out, possibly wget or just manually scraping the files and hoping no major updates are needed.
EDIT: Here's the latest version of the builder and here's the output. The build directory has the proper structure and most of the files, but the files are empty - they have the right name and extension, but no data inside them.
I am coding a PHP script that searches the local directory for files, then scrapes my localhost (XAMPP) for the same files to copy into a build folder (the goal is to build the site with PHP on localhost and then put it on a server as plain HTML).
Unfortunately I am getting the error: Warning: copy(https:\\localhost\intranet\builder.php): failed to open stream: No such file or directory in C:\xampp\htdocs\intranet\builder.php on line 73.
That's one example - every file in the local directory is spitting the same error back. The source addresses are correct (I can get to the file on localhost from the address in the error log) and the local directory is properly constructed - just moving the files into it doesn't work. The full code is here, the most relevant section is:
// output build files
foreach ($paths as $path)
{
    echo "<br>";
    $path = str_replace($localroot, "", $path);
    $source = $hosted . $path;
    $dest = $localbuild . $path;
    if (is_dir_path($dest))
    {
        mkdir($dest, 0755, true);
        echo "Make folder $source at $dest. <br>";
    }
    else
    {
        copy($source, $dest);
        echo "Copy $source to $dest. <br>";
    }
}
You are trying to use URLs to traverse local filesystem directories. URLs are only meant for the webserver to understand web requests.
You will have more luck if you change this:
copy(https:\\localhost\intranet\builder.php)
to this:
copy(C:\xampp\htdocs\intranet\builder.php)
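If all you need is a copy of the source files themselves (not their rendered output), one hedged approach is to map the URL prefix back to a filesystem path before calling copy(). This is only a sketch; it assumes $hosted is defined with forward slashes (e.g. https://localhost/) and XAMPP's default document root of C:\xampp\htdocs:
// hypothetical URL-to-path mapping, assuming the default XAMPP docroot
$localSource = str_replace('https://localhost/', 'C:/xampp/htdocs/', $source);
// PHP on Windows accepts forward slashes, but normalise the separators if you prefer
$localSource = str_replace('/', DIRECTORY_SEPARATOR, $localSource);
copy($localSource, $dest);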
EDIT
Based on your additional info in the comments, I understand that you need to generate static HTML files for hosting on a static-only webserver. This is not really an issue of copying files; it's about accessing the HTML that the scripts generate when run through a webserver.
You can do this in a few different ways. I'm not sure exactly how the generator script works, but it seems like it is trying to copy the supposed output of loads of PHP files.
To get the generated content from a PHP file you can either use the command-line php binary to execute the script, like so: c:\some\path>php some_php_file.php > my_html_file.html, or use the power of the webserver to do it for you:
<?php
$hosted = "https://localhost/intranet/"; // <--- UPDATED
foreach ($paths as $path)
{
    echo "<br>";
    $path = str_replace($localroot, "", $path);
    $path = str_replace("\\", "/", $path); // <--- ADDED
    $source = $hosted . $path;
    $dest = $localbuild . $path;
    if (is_dir_path($dest))
    {
        mkdir($dest, 0755, true);
        echo "Make folder $source at $dest. <br>";
    }
    else
    {
        // fetch the rendered output through the webserver
        $content = file_get_contents($source);
        file_put_contents(str_replace(".php", ".html", $dest), $content);
        echo "Copy $source to $dest. <br>";
    }
}
In the code above I use file_get_contents() to read the HTML from the https://... URL you are using. Unlike copy() on a local path, this calls up the webserver and triggers the PHP engine to produce the output.
Then I write the pure HTML to a file in the $dest folder, replacing .php with .html in the filename.
EDIT
Added and revised the code a bit above.
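For reference, the command-line alternative mentioned above could also be scripted. This is only a rough sketch; it assumes php is on the PATH, XAMPP's default paths, and that the pages don't rely on webserver-only context such as $_SERVER or sessions:
<?php
// hypothetical batch version of the CLI approach
foreach (glob('C:/xampp/htdocs/intranet/*.php') as $phpFile) {
    $htmlFile = preg_replace('/\.php$/', '.html', $phpFile);
    // render each script with the CLI and redirect its output to an .html file
    shell_exec('php ' . escapeshellarg($phpFile) . ' > ' . escapeshellarg($htmlFile));
}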

How can I check an uploaded file's extension?

How can I check the uploaded file's extension in the following code (I already wrote a file type check)? I want to prevent uploading image files with a wrong extension, like *.jpg.exe.
My code:
<?php
class Uploader {
    private $fileName;
    private $fileData;
    private $destination;

    public function __construct($key){
        $this->fileName = $_FILES[$key]['name'];
        $this->fileData = $_FILES[$key]['tmp_name'];
    }

    public function saveIn($folder){
        $this->destination = $folder;
    }

    public function save(){
        $folderWriteAble = is_writable($this->destination);
        if($folderWriteAble && (exif_imagetype($this->fileData) == IMAGETYPE_JPEG)){
            $name = "$this->destination/$this->fileName";
            $success = move_uploaded_file($this->fileData, $name);
        } else {
            trigger_error("cannot write to $this->destination");
            $success = false;
        }
        return $success;
    }
}
If your server(s) run Linux, I would check the file content type with the file command, which returns the real MIME type of the file. Then you can be sure what the content is (in most cases).
That program uses magic bytes: the idea is to check the first few bytes of a file for a known pattern, e.g. "MZ" for Windows executables or "‰PNG" for PNG files. The file program does more than just this basic check of the leading bytes, though.
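If shelling out to file is not possible, PHP's fileinfo extension exposes the same magic-byte database. A minimal sketch, assuming the extension is enabled and the upload field is named image:
<?php
// inspect the magic bytes of the uploaded temp file via fileinfo
$finfo = new finfo(FILEINFO_MIME_TYPE);
$mime  = $finfo->file($_FILES['image']['tmp_name']);
if ($mime !== 'image/jpeg') {
    // the content is not really a JPEG, reject the upload
}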
Judging by the comments, you are concerned about wrong, e.g. double, file extensions. I would say don't worry about the original name at all and just rename the file, ideally with a random name (see the sketch below). That is also helpful if you worry about somebody simply counting up file numbers to find unpublished images.
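A hedged sketch of such a rename inside the save() method above, assuming the JPEG check has already passed so a fixed .jpg extension can be forced (random_bytes() needs PHP 7+; uniqid() could stand in on older versions):
// generate an unguessable name and force a known-safe extension
$newName = bin2hex(random_bytes(16)) . '.jpg';
$success = move_uploaded_file($this->fileData, $this->destination . '/' . $newName);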
I think you already do this on (exif_imagetype($this->fileData) == IMAGETYPE_JPEG), but there's a really good discussion on this here: https://security.stackexchange.com/questions/57856/is-there-a-way-to-check-the-filetype-of-a-file-uploaded-using-php
Use getimagesize(), which inspects the actual image header. Note that the type information in $_FILES isn't secure, since it is based on the file extension and the client-supplied MIME type (which people can change, of course), whereas getimagesize() reads the file's content.
Usage:
$image = getimagesize($_FILES['image']['tmp_name']);
$filetype = $image['mime'];
Hope this helps
I know this won't necessarily answer your specific question, but a good way to prevent "PHP images" from being "executed" is to serve images from a place that doesn't execute PHP scripts and only serves static files (i.e. nginx, if properly configured).
It could even be an external CDN or just a simple directory that doesn't run PHP.
That being said, you can also try:
1- Make sure the file type is jpg, gif or png
2- Make sure the dimensions are numbers
3- Make sure the file size does not exceed the allowed size
4- Make sure the file is not executable by anyone else (proper chmod settings are important in a shared environment)
5- Rename all uploads and convert them through ImageMagick to jpg (or your desired format); a sketch of this step follows below
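A rough sketch of point 5, assuming the Imagick extension is installed; the upload field name image and the /path/to/uploads/ directory are placeholders:
<?php
// re-encode the upload to JPEG and give it a random name (point 5)
$img = new Imagick($_FILES['image']['tmp_name']);
$img->setImageFormat('jpeg');                      // normalise every upload to JPEG
$newName = md5(uniqid('', true)) . '.jpg';         // random file name
$img->writeImage('/path/to/uploads/' . $newName);  // placeholder destination
$img->destroy();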
Use the GD library to test whether your upload is a JPEG; as a bonus, it also returns false for partially uploaded images:
$image = @imagecreatefromjpeg($this->fileData);
if (!$image) { return false; } // file is not a jpg
imagedestroy($image);
return true; // file is a jpg
If you can use exec(), you may also invoke the Unix file utility to check the binary signature.
// verify that the file is a jpg
$mime = "image/jpeg; charset=binary";
exec("file -bi " . escapeshellarg($this->fileData), $out);
if ($out[0] != $mime) {
    // file is not a jpg
    ...
If you have ClamAV installed, you can also check for viruses with exec():
exec("clamscan --stdout " . escapeshellarg($this->fileData), $out, $return);
if ($return) {
    // file is infected
    ...

Upload external image to webhost via php

I want to upload an external image to my remote host via PHP. That means I want code which copies http://www.exaple.com/123.jpg to http://www.mysite.com/image.jpg.
I can't use the exec() function, because it has been disabled by the web host.
Best regards
Google is your friend:
<?php exec("wget http://www.exaple.com/123.jpg"); exec("mv 123.jpg image.jpg")?>
(this code is to be executed on the server www.mysite.com)
You can run a PHP script to get the file and save it to your site.
<?php
$data = file_get_contents('http://www.exaple.com/123.jpg');
if ($data !== false) {
    file_put_contents('image.jpg', $data);
} else {
    // error in fetching the file
}
file_get_contents() is binary safe. You can find more about it on the PHP website: http://php.net/manual/en/function.file-get-contents.php
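If allow_url_fopen happens to be disabled on the host as well, a hedged alternative is the cURL extension (assuming it is available):
<?php
// download the remote image with cURL and write it to a local file
$ch = curl_init('http://www.exaple.com/123.jpg');
$fp = fopen('image.jpg', 'wb');
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$ok = curl_exec($ch);
curl_close($ch);
fclose($fp);
if (!$ok) {
    // error in fetching the file
}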

PHP failed to read file when executed in PowerShell

I have a PHP script that reads and modifies the content of files to compile a set of JS files (i.e. minify them). It works when I run it through the webserver, but when I execute it in PowerShell the files can't be read.
The purpose is to create an automated script to build the compiled JS file for deployment.
Here is the PHP code I'm using:
$compiledFile = './compiled.js';
$minifiedJs = '';
$files = array(
    'script/file1.js',
    'script/file2.js',
    'script/file3.js'
);
foreach ($files as $file) {
    $file = '../' . $file;
    if (!is_file($file)) {
        // get here all the time when run in PowerShell
        continue;
    }
    $content = file_get_contents($file);
    $minifiedJs .= JSMin::minify($content);
}
file_put_contents($compiledFile, $minifiedJs);
I have a feeling that it has something to do with the file path, but hours of searching haven't helped.
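The hunch about the path is plausible: under the webserver PHP usually runs with the working directory set to the script's own folder, while under PowerShell the working directory is wherever the shell happens to be, so relative paths like '../script/file1.js' resolve differently. A hedged sketch of the same loop anchored to the script's location instead:
// build paths relative to this script's directory rather than the
// current working directory, so CLI and webserver runs behave the same
foreach ($files as $file) {
    $file = __DIR__ . '/../' . $file;
    if (!is_file($file)) {
        continue;
    }
    $content = file_get_contents($file);
    $minifiedJs .= JSMin::minify($content);
}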

PHP file upload results in broken files on Windows Apache server

I have written a file upload script in PHP. I'm testing on a Windows Apache server, but it will finally have to work on a CentOS server with Apache. Because I am still debugging, I haven't tested it on the live Linux machine. I have Googled it, but can't find a good solution.
What happens: when I upload a .png or .jp(e)g file, everything seems to go well. The script moves my file to the right directory, but the resulting file is not readable. Check the image below. The upload part of my script:
if ($posting_photo === true) {
    $uploaded_file = $_FILES['review_photo'];
    $uploaded_file['ext'] = explode('.', $uploaded_file['name']);

    //Only continue if a file was uploaded with a name and an extension
    if ((!empty($uploaded_file['name'])) && (is_array($uploaded_file['ext']))) {
        $uploaded_file['ext'] = secure(strtolower(end($uploaded_file['ext'])));
        $upload_session = secure($_COOKIE[COOKIE_NAME_POST_REVIEW_SESSION]);
        $upload_dir = realpath(getcwd() . DIR_REVIEW_IMAGES) . DIRECTORY_SEPARATOR . $upload_session . DIRECTORY_SEPARATOR;
        $upload_ext = array('jpg', 'jpeg', 'png');

        //Only continue if a file was uploaded in the right way
        if (($uploaded_file['error'] == 0) && ($uploaded_file['size'] > 0) && ($uploaded_file['size'] <= MAX_FILESIZE_REVIEW_PHOTO_BYTES) && (in_array($uploaded_file['ext'], $upload_ext))) {

            //Check if the upload dir already exists. If so, there will probably be files in it, so we check for that too. If not, we can start with the first file.
            if (is_dir($upload_dir)) {
                //
                //
                //Part where a new file name gets generated. Not very interesting and it works well, so I left it out on Stack Overflow
                //
                //
            } else {
                mkdir($upload_dir, 0777, true);
                $upload_name = $upload_session . '-1.' . $uploaded_file['ext'];
            }

            //Check if a new upload name was generated. If not, something is wrong and we will not continue
            if (!empty($upload_name)) {
                chmod($upload_dir, 0777);
                if (move_uploaded_file($uploaded_file['tmp_name'], $upload_dir . $upload_name)) {
                    $files = array_diff(@scandir($upload_dir), array('.', '..'));

                    //Change slashes on Windows machines for showing the image later
                    $upload_dir = str_replace('\\', '/', $upload_dir);
                }
            }
        }
    }
}
All variables not initialized here are initialized earlier. The cookie is set and checked before, and is used for unique directory and file names. Check this image for the output file. The x at the bottom left is from Dropbox (Dropbox says: Can't sync ... access denied). The Photo Viewer window says: Cannot open this picture because you have no access to the file location. Viewing the file in the browser results in a permission denied error.
Link to image
Is anyone familiar with this problem, or does anyone know a solution? I hope you can help me out here!
Had this once too. It's a bug in PHP 5.3.x. After upgrading from 5.3.3 to 5.6 the problems were gone!
Okay, here's an update about the problem... The issue described above was caused by my local server at the office. On my local server at home, the images were fine after moving. So it's not my script, but the server. I guess PHP handles something differently.
The PHP version at my office is 5.4.9; PHP at home is 5.3.3. Maybe this info can help?
