I'm trying to install ClamAV on Windows but I can't find instructions on how to do it.
What I actually want is to scan uploaded files for malware and return a value like "safe" or "Infected by: X".
Do you think it's possible on Windows using a free library?
Do you know if there is paid software that can do this (even via the command line)?
I managed to do it by installing ClamWin (clamwin-0.97.6) on the Windows 2008 Server. I created the eicar.txt file in order to test detection:
X5O!P%#AP[4\PZX54(P^)7CC)7}$EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*
Created test.php file:
<?php
// Infected test file and the path to the virus definition database
$file = 'C:/Users/Localadmin/Desktop/testfile/eicar.txt';
$db   = 'C:/Documents and Settings/All Users/.clamwin/db/';

// escapeshellarg() quotes the paths so the spaces in $db survive the shell
$scan_result = shell_exec('D:/programs/clamwin/bin/clamscan --database=' . escapeshellarg($db) . ' ' . escapeshellarg($file));
echo $scan_result;
?>
It gives me this result:
Eicar-Test-Signature FOUND
----------- SCAN SUMMARY -----------
Known viruses: 1568163
Engine version: 0.97.6
Scanned directories: 0
Scanned files: 1
Infected files: 1
Data scanned: 0.00 MB
Data read: 0.00 MB (ratio 0.00:1)
Time: 7.363 sec (0 m 7 s)
Then you can process the string $scan_result to figure out what number was returned after 'Infected files: '.
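For example, a minimal sketch of that processing, assuming the plain-text clamscan output shown above:
<?php
// $scan_result holds the clamscan output from the snippet above
if (preg_match('/Infected files: (\d+)/', $scan_result, $m)) {
    if ((int)$m[1] === 0) {
        echo 'safe';
    } elseif (preg_match('/(\S+) FOUND/', $scan_result, $sig)) {
        echo 'Infected by: ' . $sig[1];
    } else {
        echo 'Infected';
    }
} else {
    echo 'Scan failed'; // output did not contain a scan summary
}
?>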
I will be using it to scan files uploaded via a form, and since the scanning takes time (7 seconds) I will use some Ajax script which can nicely return feedback to the user, such as "Uploading file..." and "Scanning for viruses...".
You can install ClamAV for Windows (ClamWin) and use PHP's passthru() function to scan a file via the command line and get the output back. Parse it, then display your message. You will have to adjust your PHP timeout value, or structure your application so the upload completes and the user polls for status while a background script scans the file and inserts the result into a database. Try looking at virustotal.com; they do this, scanning each file with over 20 AV scanners.
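A rough sketch of that approach (untested on ClamWin specifically): clamscan documents exit status 0 for "no virus found" and 1 for "virus found", so you can also skip text parsing and check the return code.
<?php
$file = 'C:/uploads/somefile.tmp'; // hypothetical path to the uploaded file
set_time_limit(120); // scanning can take a while, so raise the PHP timeout
// exec() rather than passthru() so the exit status can be captured
exec('D:/programs/clamwin/bin/clamscan ' . escapeshellarg($file), $output, $status);
echo ($status === 0) ? 'safe' : 'infected (or scan error)';
?>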
I'm trying to find a way to print PDFs which are downloaded from our PHP-based website.
Currently we have to point every printer ourselves (small label printer, large label printer, laser jet), but we would like a workflow like this:
On the website you click the small label icon.
A PDF is generated with the label and gets a filename ending in .pdf, stored on the file server in the user's folder.
Based on the prefix and suffix of the file, a monitoring program watching the folders sends a command to print to the printer specified in the file.
Basically I'm aiming for a kiosk printing mode, with a way for every user to specify which printer is nearby and should be used.
Is this functionality easily achieved?
Doing this in PHP is quite hard; it would be easier to use Python.
For Windows
You have to make a Python script which scans, every 10 seconds (or less), a folder where you put all the files to print.
import os, time

while True:
    # list everything currently waiting in the folder to be printed
    files_in_dir = os.listdir('path to the scanned folder')
    time.sleep(10)  # sleep 10 seconds before the next scan
Then, you can use gsprint to send the file to the printer
import subprocess

p = subprocess.Popen(
    ['path_to_gsprint.exe', '-printer', 'printer_name', 'fullpath_file'],
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE)
out, err = p.communicate()
if out:
    print(out.decode())             # normal gsprint output
if err:
    print('ERROR: ' + err.decode()) # anything gsprint wrote to stderr
Once the file is printed, delete it or move it to another folder so it will not be scanned again.
os.rename(filename, os.path.join('path to printed folder', filename))  # move the file
os.remove(filename)  # or delete it
For Linux
On Linux the script is quite similar: using Python, you scan a folder and send the files you find to the printer with the lp command.
lp -d PRINTER_NAME *.pdf
Again, once the files are printed, delete or move them.
With this technique you will not get instant printing; you will have to wait for the script to scan the folder, but if you set a short time between scans this won't be a problem.
I have a CIFS share from Windows Server 2012 R2 mounted on Ubuntu 14.04.2 LTS (kernel 3.13.0-61-generic) like this
/etc/fstab
//10.1.2.3/Share /Share cifs credentials=/root/.smbcredentials/share_user,user=share_user,dirmode=0770,filemode=0660,uid=4000,gid=5000,forceuid,forcegid,noserverino,cache=none 0 0
The gid=5000 corresponds to group www-data which runs a PHP process.
The files are mounted correctly when I check via the console logged in as the www-data user - they are readable and removable (the operations that are used by the PHP script).
The PHP script is processing about 50,000-70,000 files per day. The files are created on the host Windows machine, and some time later the PHP script running on the Linux machine is notified about a new file, checks if the file exists (file_exists), reads it and deletes it. Usually all works fine, but sometimes (a few hundred to 1,000-2,000 times per day) the PHP script raises an error that the file does not exist. That should never happen, since it is notified only of files that actually exist.
When I manually check those files reported as not existing, they are correctly accessible on the Ubuntu machine and have a creation date from before the PHP script checked their existence.
Then I trigger the PHP script manually to pick up that file and it is picked up without problems.
What I already tried
There are multiple similar questions, but I seem to have exhausted all the advice:
I added clearstatcache() before checking file_exists($f)
The file and directory permissions are OK (exactly the same file is picked up correctly later on)
The path used for checking file_exists($f) is an absolute path with no special characters - the file paths are always of format /Share/11/222/333.zip (with various digits)
I used noserverino share mount parameter
I used cache=none share mount parameter
/proc/fs/cifs/Stats/ displays as below, but I don't know if there is anything suspicious here. The share in question is 2) \\10.1.2.3\Share
Resources in use
CIFS Session: 1
Share (unique mount targets): 2
SMB Request/Response Buffer: 1 Pool size: 5
SMB Small Req/Resp Buffer: 1 Pool size: 30
Operations (MIDs): 0
6 session 2 share reconnects
Total vfs operations: 133925492 maximum at one time: 11
1) \\10.1.2.3\Share_Archive
SMBs: 53824700 Oplocks breaks: 12
Reads: 699 Bytes: 42507881
Writes: 49175075 Bytes: 801182924574
Flushes: 0
Locks: 12 HardLinks: 0 Symlinks: 0
Opens: 539845 Closes: 539844 Deletes: 156848
Posix Opens: 0 Posix Mkdirs: 0
Mkdirs: 133 Rmdirs: 0
Renames: 0 T2 Renames 0
FindFirst: 21 FNext 28 FClose 0
2) \\10.1.2.3\Share
SMBs: 50466376 Oplocks breaks: 1082284
Reads: 39430299 Bytes: 2255596161939
Writes: 2602 Bytes: 42507782
Flushes: 0
Locks: 1082284 HardLinks: 0 Symlinks: 0
Opens: 2705841 Closes: 2705841 Deletes: 539832
Posix Opens: 0 Posix Mkdirs: 0
Mkdirs: 0 Rmdirs: 0
Renames: 0 T2 Renames 0
FindFirst: 227401 FNext 1422 FClose 0
One pattern I think I see is that the error is raised only if the file in question has already been processed (read and deleted) earlier by the PHP script. There are many files that have been correctly processed and then processed again later, but I have never seen that error for a file being processed for the first time. The time between reprocessings varies from 1 to about 20 days. For reprocessing, the file is simply recreated under the same path on the Windows host with updated content.
What can be the problem? How can I investigate better? How can I determine if the problem lies on the PHP or OS side?
Update
I have moved the software that produces the files to an Ubuntu VM that mounts the same shares the same way. This component is coded in Java. I am not seeing any issues when reading/writing the files.
Update - PHP details
The exact PHP code is:
$strFile = zipPath($intApplicationNumber);
clearstatcache();
if (!file_exists($strFile)) {
    return responseInternalError('ZIP file does not exist', $strFile);
}
$intApplicationNumber is a request parameter (e.g. 12345678) which is simply transformed into a path by the zipPath() function (e.g. /Share/12/345/678.zip; always a full path).
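The question does not include zipPath() itself; a hypothetical version consistent with that example might look like:
function zipPath($intApplicationNumber) {
    // hypothetical: 12345678 -> /Share/12/345/678.zip (assumes 8-digit numbers)
    $str = str_pad((string)$intApplicationNumber, 8, '0', STR_PAD_LEFT);
    return '/Share/' . substr($str, 0, 2) . '/' . substr($str, 2, 3) . '/' . substr($str, 5, 3) . '.zip';
}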
The script may be invoked concurrently with different application numbers, but will not be invoked concurrently with the same application number.
If the script fails (returns the 'ZIP file does not exist' error), it will be called again a minute later. If that fails too, the file is permanently marked as failed. Then, usually more than an hour later, I can call the script manually with the same invocation (GET request) that is done on production, and it works fine: the file is found and sent in the response:
public static function ResponseRaw($strFile){
    ob_end_clean();
    self::ReadFileChunked($strFile, false);
    exit;
}

protected static function ReadFileChunked($strFile, $blnReturnBytes=true) {
    $intChunkSize = 1048576; // 1M
    $strBuffer = '';
    $intCount = 0;
    $fh = fopen($strFile, 'rb');
    if($fh === false){
        return false;
    }
    while(!feof($fh)){
        $strBuffer = fread($fh, $intChunkSize);
        echo $strBuffer;
        if($blnReturnBytes){
            $intCount += strlen($strBuffer);
        }
    }
    $blnStatus = fclose($fh);
    if($blnReturnBytes && $blnStatus){
        return $intCount;
    }
    return $blnStatus;
}
After the client receives the file, it notifies the PHP server that the file can be moved to an archive location (by means of copy() and unlink()). That part works fine.
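That archive step presumably looks something like the following sketch (the /Share_Archive target is an assumption based on the mount statistics above):
$archiveFile = str_replace('/Share/', '/Share_Archive/', $strFile); // assumed archive layout
if (copy($strFile, $archiveFile)) {
    unlink($strFile); // remove the original only after a successful copy
}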
STRACE result
After several days of no errors, the error reappeared. I ran strace and it reports
access("/Share/11/222/333.zip", F_OK) = -1 ENOENT (No such file or directory)
for files that do exist when I run ls /Share/11/222/333.zip from the command line. Therefore the problem is at the OS level; PHP is not to blame.
The errors started appearing when the load on the host's disk increased (due to other processes), so #risyasin's suggestion below seems most likely: it's a matter of busy resources/timeouts.
I'll try #miguel-svq's advice of skipping the existence test and just going for fopen() right away, handling the error there. I'll see if it changes anything.
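For reference, a minimal sketch of that approach (the retry count and delay are arbitrary choices):
// Instead of file_exists(), open the file directly and retry once on failure
function openZip($strFile, $intRetries = 1) {
    for ($i = 0; $i <= $intRetries; $i++) {
        $fh = @fopen($strFile, 'rb'); // @ suppresses the warning on failure
        if ($fh !== false) {
            return $fh;
        }
        usleep(500000); // wait 0.5 s before retrying
    }
    return false;
}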
You can try to use the directio option to avoid doing inode data caching on files opened on this mount:
//10.1.2.3/Share /Share cifs credentials=/root/.smbcredentials/share_user,user=share_user,dirmode=0770,filemode=0660,uid=4000,gid=5000,forceuid,forcegid,noserverino,cache=none,directio 0 0
This is hardly a definitive answer to my problem, rather a summary of what I found out and what I settled on.
At the bottom of the problem lies the fact that it is the OS that reports that the file does not exist. Running strace occasionally shows
access("/Share/11/222/333.zip", F_OK) = -1 ENOENT (No such file or directory)
for the files that do exist (and show up when listed with ls).
The Windows share host was sometimes under heavy disk load. What I did is move one of the shares to a different host, so the load is now spread between the two. Also, the general load on the system has been a bit lighter lately. Whenever I get the error about a file not existing, I retry the request some time later and the error is gone.
I want to zip a large folder of 50K files on Windows Server. I'm currently using this code:
include_once("CreateZipFile.inc.php");
$createZipFile = new CreateZipFile;

$directoryToZip = "repository";
$outputDir = ".";
$zipName = "CreateZipFileWithPHP.zip";

define("ZIP_DIR", 1);
if (ZIP_DIR) {
    // Code to zip a directory and all its files/subdirectories
    $createZipFile->zipDirectory($directoryToZip, $outputDir);
}

$fd = fopen($zipName, "wb");
$out = fwrite($fd, $createZipFile->getZippedfile());
fclose($fd);
$createZipFile->forceDownload($zipName);
#unlink($zipName);
Everything works fine up to around 2K image files, but that is not enough: I want to be able to zip at least 50K images. Meanwhile my script hits this error:
Fatal error: Maximum execution time of 360 seconds exceeded in C:\xampp\htdocs\filemanager\CreateZipFile.inc.php on line 92
$newOffset = strlen(implode("", $this->compressedData));
I'm searching for any solution that can handle such a huge number of files. I currently use XAMPP on Windows Server 2008 Standard. Is there any possibility to build the zip in smaller parts, use a system command and maybe an external tool to pack them, and then send the result to the browser for download?
http://pastebin.com/iHfT6x69 for CreateZipFile.inc.php
Try this to increase the execution time:
ini_set('max_execution_time', 500);
500 is the number of seconds; change it to whatever you like.
Do you need a smaller file or a fast-served file?
For fast serving without compression and without memory leaks, you could try to use a system command with zip software like gzip, turning the compression off.
The files would probably get huge, but would be served fast as one file.
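A minimal sketch of that idea, assuming the Info-ZIP zip command-line tool is installed and on the PATH (-0 means "store only", i.e. no compression):
$zipName = 'repository.zip';
// -r recurses into the directory, -0 stores files without compressing them
exec('zip -r -0 ' . escapeshellarg($zipName) . ' ' . escapeshellarg('repository'));
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="' . $zipName . '"');
readfile($zipName); // stream the archive to the client without buffering it all in PHP
unlink($zipName);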
I have some code that copies a file to a temporary location where it is later included in a zip file.
I already have the source files stored in a local cache directory, and have also stored the SHA1 hash of the original files. The files in question are .png images, ranging from a few kb to around 500kb.
My problem is that at high server loads, the copy intermittently fails. Upon examining my logs, I see that even though a healthy file exists in the source location, the destination contains a file with zero bytes.
So, to try and figure out what was going on and to increase reliability, I implemented a SHA1 check of the destination file, so that if it fails, I can retry the copy using the shell.
99.9% of the time, the files copy with no issue. Occasionally the first copy fails but the second attempt succeeds. In a small number of cases (around 1 in 2,500, and always at high server load), both copies fail. In nearly all these cases the SHA1 of the destination file is da39a3ee5e6b4b0d3255bfef95601890afd80709, which is the hash of an empty file.
On every occasion the script continues, and the created zip includes an empty image file. There is nothing in the Nginx, PHP or PHP-FPM error logs that indicates any problem. The script will copy the same file successfully when retried.
My stack is Debian Squeeze with the .deb PHP 5.4/PHP 5.4 FPM packages and Nginx 1.2.6 on an Amazon EBS backed AMI. The file system is XFS and I am not using APC or other caching. The problem is consistent and replicable at server loads >500 hits per second.
I cannot find any documentation of known issues that would explain this behaviour. Can anyone provide any insight into what may be causing this issue, or provide suggestions on how I can more reliably copy an image file to a temporary location for zipping?
For reference, here is an extract of the code used to copy / recopy the files.
$copy = copy($cacheFile, $outputFile);
if ($copy && file_exists($outputFile) && sha1_file($outputFile) !== $storedHash) {
    // Custom function to log debug messages
    dbug(array($cacheFile, sha1_file($cacheFile),
               $storedHash, $outputFile,
               file_exists($outputFile), filesize($outputFile)),
         'Corrupt Image File Copy from Cache 1 (native)');
    // Try with exec; escapeshellarg() protects against spaces in the paths
    exec('cp ' . escapeshellarg($cacheFile) . ' ' . escapeshellarg($outputFile));
    if (file_exists($outputFile) && sha1_file($outputFile) !== $storedHash) {
        dbug(array($cacheFile, sha1_file($cacheFile),
                   $storedHash, $outputFile,
                   file_exists($outputFile), filesize($outputFile)),
             'Corrupt Image File Copy from Cache 2 (shell exec)');
    }
}
Objective: Use PHP to call a vbs that converts an xls/xlsx file to a csv.
Question: How can I pass a source file path and a destination file path to a vbs that converts xls/xlsx to csv and run that vbs in a PHP web application?
Details: I have a working VBS that takes a source file path and a destination file path and converts the xls/xlsx at the source path into a CSV. I can execute it from the Windows command line and it does exactly what I want it to do. I can also put the execution command into a BAT file and run the BAT file to achieve the same results.
However, when I use exec()/shell_exec()/system() in PHP to execute the same command, no CSV is created. (If I try to run the BAT from PHP using system(), the contents of the BAT file show up on the page; in fact, echo Conversion complete! prints "echo Conversion complete! Conversion complete.") I haven't seen any errors yet.
Note: I know about PHPExcel, I'd prefer not to use it.
excelToCsv.vbs
On Error Resume Next

If WScript.Arguments.Count < 2 Then
    WScript.Echo "Please specify the source and the destination files. Usage: ExcelToCsv <xls/xlsx source file> <csv destination file>"
    WScript.Quit
End If
csv_format = 6
Set objFSO = CreateObject("Scripting.FileSystemObject")
src_file = objFSO.GetAbsolutePathName(Wscript.Arguments.Item(0))
dest_file = objFSO.GetAbsolutePathName(WScript.Arguments.Item(1))
Dim oExcel
Set oExcel = CreateObject("Excel.Application")
Dim oBook
Set oBook = oExcel.Workbooks.Open(src_file)
oBook.SaveAs dest_file, csv_format
oBook.Close False
oExcel.Quit
batConverter.bat
excelToCsv.vbs conversionTestSourceMS2003.xls batTest.csv
echo Conversion Complete!
index.phtml
<?php
system("cmd /c batConverter.bat");
?>
Note: All of the above files (along with conversionTestSourceMS2003.xls) are in the same directory. I have not implemented any way to pass the parameters (since I can't get it to work even if it's all hard coded...)
Set Up: PHP5, Zend Framework, WAMP, Windows 7 (localhost).
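For what it's worth, a sketch of calling the VBS directly from PHP with both paths as parameters, using cscript so it runs in the console script host (all paths here are illustrative):
<?php
$src  = 'C:\\wamp\\www\\converter\\conversionTestSourceMS2003.xls';
$dest = 'C:\\wamp\\www\\converter\\output.csv';
$cmd  = 'cscript //nologo excelToCsv.vbs '
      . escapeshellarg($src) . ' ' . escapeshellarg($dest);
exec($cmd . ' 2>&1', $output, $exitCode); // merge stderr so errors are visible
var_dump($exitCode, $output); // inspect what the script host reported
?>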
For the sake of simplicity, I merged everything into a single ASP page. This will hopefully let me see a similar problem in IIS, and since it is a single ASP script, I will be able to see the error more directly. My test machine is running Windows Vista SP2 on IIS7 with Excel 2007 SP3.
excelToCsv.asp
<%
Option Explicit
Dim csv_format, src_file, dest_file, strPath, objFSO
csv_format = 6
src_file = "conversionTestSourceMS2003.xls"
dest_file = "testbat.csv"
strPath = "[HARDCODED PATH HERE]\"
src_file = strPath & src_file
dest_file = strPath & dest_file
Dim objExcel, objBook
Set objExcel = CreateObject("Excel.Application")
Set objBook = objExcel.Workbooks.Open(src_file)
objBook.SaveAs dest_file, csv_format
objBook.Close False
Response.Write "Conversion Complete!"
objExcel.Quit
%>
When running this code, I got a generic ASP error. So, I enabled detailed error messages in ASP and I get this following error...
Microsoft Office Excel error '800a03ec'
Microsoft Office Excel cannot access the file '[HARDCODED PATH HERE]\conversionTestSourceMS2003.xls'. There are several possible reasons:
• The file name or path does not exist.
• The file is being used by another program.
• The workbook you are trying to save has the same name as a currently open workbook.
/temp/ExcelToCsv.asp, line 18
Now, this is not Apache, but I do believe the problem is related to yours. This error implies there is a security/permission problem where Excel cannot do what it needs to do to access or read the file. In fact, I encountered similar errors when I was executing the VBScript (and passing the error up the chain) from PHP (in IIS).
I believe it can be resolved by changing the Windows User being used to create the process. This can be configured in Services.msc by editing the Apache service and changing the Log On tab to an actual Windows user instead of a Service Account. I have not tested it yet, though, since setting up Apache is not something I can do right now.