Add AV Scan to File Upload Site - php

I'm not sure whether this is the right section, but I'm building a file upload site and want to be able to scan uploaded files for viruses. How would I be able to do this?
Any ideas to get me started?
Thanks

The ClamAV library has a PHP binding called php-clamav. You can then scan files for viruses from within your PHP code:
if ($_FILES['file']['size'] == 0 || !is_file($_FILES['file']['tmp_name'])) {
    throw new Exception('Please select a file for upload!');
} else {
    // Limit resource usage for the scan (see the php-clamav docs for the parameters);
    // the last value caps the scanned file size at 10 MB
    cl_setlimits(5, 1000, 200, 0, 10485760);
    if ($malware = cl_scanfile($_FILES['file']['tmp_name'])) {
        throw new Exception($malware . ' (ClamAV version: ' . clam_get_version() . ')');
    }
}
...
Another alternative is to install the Mod_Security web application firewall. It can be configured to scan all uploaded files for viruses using modsec-clamscan.

You could try something like the following using AVG:
Windows:
<?php
exec("avgscanx.exe /SCAN=filename.ext/");
$result = exec("echo %ERRORLEVEL%");
?>
Linux:
<?php
exec("avgscan filename.ext -a -H -c");
$result = exec("echo $?");
?>
Both platforms return the same error codes, allowing you to determine whether a scan was successful or not.
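Incidentally, exec() can capture the exit status directly through its third parameter, which avoids the second echo call on both platforms. A small sketch (shown with the Linux command from above; the same works for avgscanx.exe on Windows, and the file name is a placeholder):
<?php
// The third argument receives the scanner's exit code / error level
exec("avgscan filename.ext -a -H -c", $output, $result);
// $result can now be checked against AVG's documented error codes
?>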
References:
http://www.avg.com/ww-en/faq.num-4443
http://www.avg.com/ww-en/faq.num-4441
http://www.avg.com/ww-en/faq.num-1854
http://www.avg.com/ww-en/faq.num-1759

It depends on your server configuration, but on Linux, for example, it's easy to install something like ClamAV and access it through the command line. You can use something like PHP's exec() to run it.
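For example, a minimal sketch using ClamAV's clamscan binary, assuming it is installed and on the PATH:
<?php
// clamscan exit codes: 0 = clean, 1 = virus found, 2 = an error occurred
$file = escapeshellarg($_FILES['file']['tmp_name']);
exec("clamscan --no-summary " . $file, $output, $exitCode);
if ($exitCode === 1) {
    throw new Exception('Upload rejected: malware detected.');
} elseif ($exitCode === 2) {
    throw new Exception('Virus scan failed, please try again later.');
}
?>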

You could also use VirusTotal's public API. You can read more about it here. There is some PHP code available here.
This way you get a lot of scanners, and you don't have to run AV locally. On the other hand you'll have to wait a while for the result.
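As a rough sketch of the upload call against version 2 of their public API (the API key is a placeholder; check their documentation for current endpoints and rate limits):
<?php
$apiKey = 'YOUR_API_KEY';
$file   = $_FILES['file']['tmp_name'];

$ch = curl_init('https://www.virustotal.com/vtapi/v2/file/scan');
curl_setopt_array($ch, array(
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POSTFIELDS     => array(
        'apikey' => $apiKey,
        'file'   => new CURLFile($file), // PHP 5.5+; older versions used '@' . $file
    ),
));
$response = json_decode(curl_exec($ch), true);
curl_close($ch);
// The scan is queued; poll /vtapi/v2/file/report with $response['resource']
// later to collect the verdicts from the individual scanners.
?>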

Count PDF pages with PHP (in Linux)

I have a webpage where I let users upload files to their account folder (PDF and JPG files only). I want to count the number of pages inside each uploaded PDF to show it to the users.
To do this, I was using the PDFINFO Linux utility, part of the XPDF project.
This is the man page of the binary file: http://linuxcommand.org/man_pages/pdfinfo1.html
You can download the .zip with the binaries there: http://www.foolabs.com/xpdf/download.html
My code (this worked perfectly, but yesterday it failed):
function getNumPagesInPDF($document) {
    if (!file_exists($document)) return null;
    $cmd = "pdfinfo";
    // Run pdfinfo on the document
    exec($cmd . " '" . $document . "'", $output);
    // Browse the output
    $pagecount = 0;
    foreach ($output as $op) {
        // Extract the number of pages
        if (preg_match("/Pages:\s*(\d+)/i", $op, $matches) === 1) {
            $pagecount = intval($matches[1]);
            break;
        }
    }
    return $pagecount;
}
I can run the command over SSH, and it works on the server. But this code no longer works from PHP, even though nothing in the code changed.
Ah, a little addition: I checked that exec works in my PHP using:
function exec_enabled() {
    $disabled = explode(',', ini_get('disable_functions'));
    return !in_array('exec', $disabled);
}
if (exec_enabled()) {
    echo "exec works";
}
Another addition: PHP doesn't show any errors related to this, and I have error logging enabled to a log file (including warnings). My host recently activated mod_security.
TASK1: Check the $document variable: the path is OK, relative to the place where the PHP file is located. The path exists, and so does the file.
TASK2: Check whether $output contains anything: NO, the $output array is empty! Why? I cannot understand it.
TASK3: Check $cmd." '".$document."'": it's OK, and pasting the resulting command into SSH works. I'm lost.
As per the comment discussion, we've seen that running a binary using a bare filename does not always work. This is as true on the console as it is inside a system command like exec().
When you run pdfinfo in either environment, the system will search through the environment variable PATH to discover which directories to find it in. This variable is nearly always different between your user account and the Apache environment, which is why it is important to always specify the fully-qualified filename when running a binary programmatically.
As far as I know, exec() does not regard the folder containing the current PHP script as the current working directory. Even if it did, the current directory . would need to be in the Apache user's PATH in order for this to be found. Thus, I am not sure why this used to work for you, but it emphasises the importance of the above lesson: always use the full path.
You should also read the path from a settings file, rather than hardwiring it in code. This will help you as you move from local, test, staging and live environments of your app, which may store this binary in different locations.
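For example, here is a minimal sketch of both ideas together; the app.ini file, its pdfinfo key, and the /usr/bin/pdfinfo location are assumptions to adjust per environment:
<?php
// Hypothetical settings file containing a line like: pdfinfo = /usr/bin/pdfinfo
$config  = parse_ini_file('app.ini');
$pdfinfo = isset($config['pdfinfo']) ? $config['pdfinfo'] : '/usr/bin/pdfinfo';

function getNumPagesInPDF($pdfinfo, $document) {
    if (!file_exists($document)) return null;
    // Fully-qualified binary plus an escaped argument: neither PATH nor
    // shell metacharacters in the file name can break the call
    exec($pdfinfo . ' ' . escapeshellarg($document), $output);
    foreach ($output as $op) {
        if (preg_match('/Pages:\s*(\d+)/i', $op, $matches) === 1) {
            return intval($matches[1]);
        }
    }
    return 0;
}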

Is it possible to download a file from one server (SFTP) and upload it to my server using PHP?

I have been told this cannot be done but I want to get some other opinions here. I am a bit of a newbie when it comes to things like this.
My Site: ExampleSiteA.com
File to download: ExampleSiteB.com
Basically, I am downloading a csv file from ExampleSiteB.com to make updates to my site, ExampleSiteA.com. To do this, I am downloading the csv file manually through CoreFTP and then uploading it manually to ExampleSiteA.com. The file changes daily and I would like to skip this step so I can automate the process.
Keep in mind that I need to download the csv file from ExampleSiteB.com through SFTP...
I am not sure if it is possible to directly download/upload a file from one server to another if one is SFTP. The file size is also quite large, it averages about 25,000 KB / 25 MB.
Another option that I haven't explored yet is requiring or including a file from another server... is that an option or a possibility? The file is located in a folder exclusively for my site and a login is required for SFTP download.
Any insight will be appreciated. Thanks in advance!
Go here and download what you need: http://phpseclib.sourceforge.net/
UPDATE
FOR SFTP
Then in your script:
<?php
include('Net/SFTP.php');

// First fetch the CSV over HTTP with wget, then push it to the target server
$url = 'http://www.downloadsite.com';
$fileToDownload = "yourCSV.csv";
$cmd = "wget -q \"$url\" -O $fileToDownload";
exec($cmd);

$sftp = new Net_SFTP('www.uploadsite.com');
if (!$sftp->login('username', 'password')) {
    exit('Login Failed');
}
echo $sftp->pwd() . "\r\n";
$sftp->put('remote.file.csv', 'yourCSV.csv', NET_SFTP_LOCAL_FILE);
print_r($sftp->nlist());
?>
If you need to connect to a second server for the download:
$sftp2 = new Net_SFTP('www.serverFromWhichToDownload.com');
if (!$sftp2->login('username', 'password')) {
    exit('Login Failed');
}
echo $sftp2->pwd() . "\r\n";
// Note: get() takes the remote file first, then the local destination
$sftp2->get('remoteFileName.csv', 'localFileName.csv');
print_r($sftp2->nlist());
Read the docs for further help and examples: http://phpseclib.sourceforge.net/documentation/net.html#net_sftp_get
To log what your connection is doing if it fails, etc., use this:
include('Net/SSH2.php');
define('NET_SSH2_LOGGING', true);
$ssh = new Net_SSH2('www.domain.tld');
$ssh->login('username','password');
echo $ssh->getLog();
FOR FTP upload:
$file = 'somefile.txt';
$remote_file = 'readme.txt';
$ftp_server = 'ftp.example.com';   // placeholders: fill in your own server and credentials
$ftp_user_name = 'username';
$ftp_user_pass = 'password';

$conn_id = ftp_connect($ftp_server);
$login_result = ftp_login($conn_id, $ftp_user_name, $ftp_user_pass);
if (ftp_put($conn_id, $remote_file, $file, FTP_ASCII)) {
    echo "successfully uploaded $file\n";
} else {
    echo "There was a problem while uploading $file\n";
}
ftp_close($conn_id);
Yes, that's possible using ssh2_sftp.
http://php.net/manual/en/function.ssh2-sftp.php
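As a rough sketch of that approach (host, credentials, and paths are placeholders; the PECL ssh2 extension must be installed):
<?php
// Connect and authenticate over SSH
$conn = ssh2_connect('ExampleSiteB.com', 22);
if (!$conn || !ssh2_auth_password($conn, 'username', 'password')) {
    exit('SFTP connection failed');
}
// Open an SFTP session and stream the remote CSV straight to local disk
$sftp = ssh2_sftp($conn);
copy('ssh2.sftp://' . intval($sftp) . '/path/to/remote.csv', '/var/www/local.csv');
?>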
I have had good luck with cURL in the past. If you are on a Linux box, it would be trivial to set up a cron job to run this update process for you. A good reference for CLI HTTP scripting in cURL can be found here; note that you may need the -T flag (for file transport) to accomplish the upload portion. Speaking of uploading, if you can run the script/process/crontab from the server you would like to update, I would recommend downloading straight from that web server, saving one trip and a third party. Or, if you need to update on demand, you could write a PHP script that uses the built-in PHP cURL functions. If you take the Linux+CLI route, you could also use sftp.
Update: In testing curl with sftp (curl -u uname:pword sftp://domain.tld) I get the following error: curl: (1) Protocol sftp not supported or disabled in libcurl on Kubuntu 12.04. So cURL may not be a good idea. I also tested CLI sftp (sftp uname@domain.tld:/dir/file.ext) but could not find a way (short of using ssh keys) to send authentication. Thus, this would necessarily be a manual process unless you did set up ssh keys between the servers. As it does not sound like you have that kind of access to ExampleSiteB.com, this probably isn't acceptable.
Update 2: Since my initial answer turned out to be of little use, I figured I would expand upon one of the above answers. I was trying to find a solution that did not involve a PECL extension, but I did not have much luck with ftp_ssl_connect(). I recommend trying it; you may have better luck and could forgo the PECL extension route.
Sigh, on further reading, it appears ftp_ssl_connect() is, understandably, incompatible with the SFTP protocol (it speaks FTP over SSL, which is a different thing). However, I found a nice blog post about utilizing ssh2_connect() and ssh2_sftp() (as mentioned in a previous answer) and figured I would post that to give you some additional assistance. It is not as simple as calling the functions for most PHP distributions. Here is the blog post. Some of those steps may not be necessary, or you may need to do some additional things listed in another blog post I ran across, here.
On my system, all I had to do was run apt-get install libssh2-1-dev libssh2-php and I was able to find ssh2 in my php -m output.
Using an include should work, as long as you have read/write permissions on the site you're getting the file from; however, this is just guesswork at the moment, as I don't have any means of checking it. Good luck though!
Yes, you should be able to do this.
Whoever told you that you can't do this might be confusing it with JavaScript's cross-site browser restrictions, which prevent JavaScript downloaded from one domain from accessing content on a different domain.
That being said, if you are using PHP, which to me implies PHP running on a web server, you should be able to use PHP or any other scripting or programming language to download the file from SiteB.com, then update the file, and finally FTP it to a different web server (SiteA.com).

PHP script not firing when aliased to by Postfix - I'm at wits' end

I've tried to include as much info in this post as possible.
I'm using Postfix on an Amazon EC2 Ubuntu server, and it seems that a PHP script I have aliased to an address isn't firing. Mail delivery works fine; the script just never runs. I've probably missed something easy and would appreciate any other ideas on this.
The code below is that of the script. At the moment it is just a basic script that writes the contents of php://stdin to a file. I'm not sure this is the best way to write it, but it seems OK for now as a temporary script for troubleshooting this problem.
#!/usr/bin/php -q
<?php
$data = '';
$fileName = "parsedData.txt";
$stdin = fopen('php://stdin', 'r');
$fh = fopen($fileName, 'w');
while (!feof($stdin)) {
    $data .= fgets($stdin, 8192);
}
fwrite($fh, $data);
fclose($stdin);
fclose($fh);
?>
I have verified this works by passing it a .txt file containing some text.
./test2.php < data.txt
Now that my PHP script seems to work fine locally, I need to make sure it is being called correctly. sudo chmod 777 has been run on the test2.php script. Here is the relevant /etc/aliases file entry.
test: "|/usr/bin/php -q /var/test/php/test2.php"
I run newaliases every time I change this. This seems to be the most correct syntax, as it specifies the location of php fully. test@mydomain receives emails fine from both internal and external senders when it is not aliased. According to syslog, delivery goes to the command rather than to the maildir:
postfix/local[2117]: 022AB407CC: to=<test@mydomain.com>, relay=local, delay=0.5, delays=0.43/0.02/0/0.05, dsn=2.0.0, status=sent (delivered to command: /usr/bin/php -q /var/test/php/test2.php)
The alias has also been written in the following ways without success (either because they go to maildir instead of the command due to wrong syntax or the script just isn't firing).
test: |"php -q /var/test/php/test2.php"
test: "|php -q /var/test/php/test2.php"
test: |"/usr/bin/php -q /var/test/php/test2.php"
test: "| php -q /var/test/php/test2.php"
test: "|/var/test/php/test2.php"
test: |"/var/test/php/test2.php"
The relevant part of my Postfix main.cf file looks like this:
myhostname = domainnamehere.com
alias_maps = hash:/etc/aliases
alias_database = hash:/etc/aliases
myorigin = /etc/mailname
mydestination = domainnamehere.com, internalawsiphere, localhostinternalaws, localhost
relayhost =
mynetworks = 127.0.0.0/8 [::ffff:127.0.0.0]/104 [::1]/128
mailbox_command =
home_mailbox = Maildir/
mailbox_size_limit = 0
recipient_delimiter = +
inet_interfaces = all
inet_protocols = all
I had error_log("script has started!"); at the beginning of test2.php so that it would appear in the PHP log file if the script was successfully called. I made sure to turn on error logging in php.ini and to specify a location for the log file, but I couldn't get this to work. Getting it working would help, because I could then tell whether the script fails at the php://stdin read; there may be some problem with handling emails as opposed to cat'ed .txt files.
My end goal is a service that saves emails sent to a certain address, along with their attachments, to a MySQL database and then returns a unique code for each file/email to the user. Is there an easier way to do something like this than PHP scripts? Does something like SquirrelMail or DBMail do this?
I've completely exhausted my ideas with this one. Maybe I should just try another email service?
Oh wise people of StackOverflow, help me!
First things first, chmod 777 <foo> is almost always a gigantic mistake. I recognize that you're in a state of desperation -- as indicated by the fact that you ran this command. You want to configure your systems with the least amount of required privilege for each portion of the system to properly do its job. This helps prevent security breaches and reduces needless coupling. But executable files should not be writable by anyone except the executable owner -- and even then, I strongly recommend against it.
Now, onto your problem:
#!/usr/bin/php -q
<?php
$data = '';
$fileName = "parsedData.txt";
You're referring to a file using a relative pathname. This is fine, if you're always confident of the directory in which it starts, or if you want the user to be in control of the directory in which it starts, but it is usually a bad idea for automated tools. The /etc/aliases mechanism may run the aliased commands in the postfix home directory, it might pick an empty directory in /var created just for the purpose, and future releases are more or less free to change this behavior as they wish. Change this path name to an absolute pathname showing exactly where you would like this file to be created -- or insert an explicit chdir() call at the start of your script to change directories to exactly where you want your data to go.
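For instance, a minimal sketch of both options (the directory is a placeholder):
$fileName = '/var/test/php/parsedData.txt';  // absolute path: immune to the alias environment
// or, equivalently, pin the working directory first:
// chdir('/var/test/php');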
Next:
$fh = fopen($fileName, 'w');
while (!feof($stdin)) {
    $data .= fgets($stdin, 8192);
}
fwrite($fh, $data);
You did not ensure that the file was actually opened. Check those return values: they will report failure very quickly, helping you find bugs in your code or local misconfigurations. I do not know PHP well enough to tell you the equivalent of the perror(3) function that will tell you exactly what failed, but it surely can't be too difficult to get a human-readable error code out of the interpreter. Do not neglect the error codes; knowing the difference between Permission denied and No such file or directory can save hours once your code is deployed.
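For what it's worth, a small sketch of that check; error_get_last() is PHP's nearest analogue to perror(3):
$fh = fopen($fileName, 'w');
if ($fh === false) {
    // error_get_last() reports why the most recent operation (here, fopen) failed
    $err = error_get_last();
    error_log('fopen failed: ' . ($err ? $err['message'] : 'unknown error'));
    exit(1);
}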
And, as Michael points out, a mandatory access control tool can prevent an application from writing in specific locations. The "default" MAC tool on Ubuntu is AppArmor, but SELinux, TOMOYO, or SMACK are all excellent choices. Run dmesg and look in /var/log/audit/audit.log to see if there are any policy violations. If there are, the steps to take to fix the problem vary based on which MAC system you're using. Since you're on Ubuntu, AppArmor is most likely; run aa-status to get a quick overview of the services that are confined on your system, and aa-logprof should prompt you to modify policy as necessary. (Don't blindly say "Allow", either -- perhaps active exploit attempts have been denied.)
(If you aren't using a MAC system already, please consider doing so. I've worked on the AppArmor project for twelve years and wouldn't consider not confining all applications that communicate over the network -- but that's my own security needs. More paranoid people may wish to confine more of their systems, less paranoid people may wish to confine less of their systems.)

How to view PHP or Apache error log online in a browser?

Is there a way to view the PHP error logs or Apache error logs in a web browser?
I find it inconvenient to SSH into multiple servers and run a "tail" command to follow the error logs. Is there some tool (preferably open source) that shows me the error logs online (streaming or non-streaming)?
Thanks
A simple PHP script to read the log and print it:
<?php
exec('tail /var/log/apache2/error.log', $error_logs);
foreach ($error_logs as $error_log) {
    echo "<br />" . $error_log;
}
?>
You can embed the $error_log output in your HTML as required. The best part is that tail loads only the latest lines, so it won't put much load on your server.
You can pass options to tail to shape the output as you want,
e.g. tail -n 100 myfile.txt (gives the last 100 lines).
See What commercial and open source competitors are there to Splunk? and I would recommend https://github.com/tobi/clarity
Simple and easy tool.
Since everyone is suggesting clarity, I would also like to mention tailon. I wrote tailon as a more modern and secure alternative to clarity. It's still in its early stages of development, but the functionality you need is there. You may also use wtee, if you're only interested in following a single log file.
You could make a script that reads the error logs from Apache:
$apache_errorlog = file_get_contents('/var/log/apache2/error.log');
If that doesn't work, try getting it with the PHP functions exec or shell_exec and the command 'cat /var/log/apache2/error.log'.
EDIT: If you have multiple servers (I guess with web servers on them), you can put such a script on each machine; when you make a request to that script (over a secured connection), you get the logs from that server.
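A rough sketch of that idea (the token is a placeholder; serve it only over HTTPS):
<?php
// One copy of this endpoint per server; a shared secret keeps strangers out
$token = isset($_GET['token']) ? $_GET['token'] : '';
if ($token !== 'my-secret-token') {
    header('HTTP/1.1 403 Forbidden');
    exit('Access denied');
}
header('Content-Type: text/plain');
exec('tail -n 100 /var/log/apache2/error.log', $lines);
echo implode("\n", $lines);
?>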
I recommend LogHappens: https://loghappens.com, it allows you to view the error log in web, and this is what it looks like:
LogHappens supports kinds of web server log format, it comes with parses for Apache and CakePHP, and you can write your own.
You can find it here: https://github.com/qijianjun/logHappens
It's open source and free. I forked it and did some work to make it work better in a dev or public environment. That is:
Support a token for security: one can't access the site without the token in config.php
Support IP whitelists for security and privacy
Support configuring the interval between AJAX requests
Support loading static files from local disk (for a local dev environment)
I've found this solution https://code.google.com/p/php-tail/
It works perfectly. I only needed to change the maximum file size, because I was getting an error at first.
56 if($maxLength > $this->maxSizeToLoad) {
57 $maxLength = $this->maxSizeToLoad;
58 // return json_encode(array("size" => $fsize, "data" => array("ERROR: PHPTail attempted to load more (".round(($maxLength / 1048576), 2)."MB) then the maximum size (".round(($this->maxSizeToLoad / 1048576), 2) ."MB) of bytes into memory. You should lower the defaultUpdateTime to prevent this from happening. ")));
59 }
And I've added a default size, but it's not needed:
125 lastSize = <?php echo filesize($this->log) || 1000; ?>;
I know this question is a bit old, but (along with the lack of good choices) it gave me the idea to create this tiny (open source) web app. https://github.com/ToX82/logHappens. It can be used online, but I'd use an .htpasswd as a basic login system. I hope it helps.

PHP - Create directory on different server

I have WAMP running (a server called Emerald) and MAMP running on my Mac. People register on MAMP; Emerald is basically file hosting.
Emerald connects to MAMP's MySQL database to log users in. However, I want to create directories for new registrations on Emerald using PHP.
How can I do this? I have tried using this code:
$thisdir = "192.168.1.71";
$name = "Ryan-Hart";
if (mkdir($thisdir . "/documents/$name", 0777)) {
    echo "Directory has been created successfully...";
}
But had no luck. It basically needs to connect the other server and create a directory, in the name of the user.
I hope this is clear.
You can't create directories over HTTP. You need a filesystem connection to the remote location (a local hard disk, or a network share, for example).
The easiest way that doesn't require setting up FTP, SSH or a network share would be to put a PHP script on Emerald:
<?php
// Skipping sanitation because it's only going to be called
// from a friendly script. If "dir" is user input, you need to sanitize it.
$dirname = $_GET["dir"];
$secret_token = "10210343943202393403";
if ($_GET["token"] != $secret_token) die("Access denied");
// Alternatively, you could restrict access to one IP

error_reporting(0); // Raise this to see mkdir's error messages
$success = mkdir("/home/www/htdocs/docs/" . $dirname);
if ($success) echo "OK"; else echo "FAIL";
and call it from the other server:
$success = file_get_contents("http://192.168.1.71/create_script.php?token=10210343943202393403&dir=HelloWorld");
echo $success; // "OK" or "FAIL"
Create a script on the other server that creates the dir, and call it remotely.
Make sure you have a security check (at least a simple password).
There is no generic method to access remote server filesystems. You have to use a file transfer protocol and server software to do so. One option would be SSH, which however requires some setup.
$thisdir = "ssh2.sftp://user:pass@192.168.1.71/directory/";
On Windows you might get FTP working more easily, so using an ftp:// url as directory might work.
As a last alternative you could enable WebDAV on your WAMP web server (the PUT method alone works for file transfers, not for creating directories). But then you probably can't use the raw PHP file functions; you'd likely need a wrapper class or curl to utilize it.
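On that note, a quick sketch of the ftp:// idea (credentials are placeholders, and allow_url_fopen must be enabled):
// PHP's built-in FTP stream wrapper supports mkdir directly
$ok = mkdir('ftp://user:pass@192.168.1.71/documents/Ryan-Hart');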
I know this is old, but I think this might be useful. In my experience,
if (mkdir($thisdir . "/documents/name", 0777))
doesn't work when the parent directories don't exist yet; I need to create each level:
mkdir($thisdir, 0777);
mkdir($thisdir . "/documents", 0777);
mkdir($thisdir . "/documents/name", 0777);
hope it helps :)
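For what it's worth, mkdir() also takes a third, recursive parameter that creates any missing parents in one call:
// The third argument tells mkdir() to create missing parent directories as needed
mkdir($thisdir . "/documents/name", 0777, true);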
