How to log activities done using ssh [duplicate] - php

Possible duplicate: Best way to monitor file system changes in linux
I need your help. How can I log activities performed over SSH on a Linux server, such as creating, deleting, or renaming files and directories under a particular path? I am looking for a bash, Python, or PHP script, or any Linux facility, that lets me watch everything that happens under a particular path or folder. I need those logs for syncing purposes.
OK, let me explain the entire scenario. I am working on a sync tool. We use Samba to share all the files and folders, and I need those files synced across the network. I grep the Samba log to watch what clients do (create, delete, or rename files and folders), feed that into my syncing tool, and it works fine. But I only get log entries when changes are made through Samba; if changes are made over SSH, those activities are not logged and will not be synced. So I need an equivalent log of the changes made over SSH under a particular path (for example /mnt/test): creates, deletes, and renames.

As I understand it, what's happening here is this.
There is a Samba server that exports a filesystem to multiple users, plus a group of users who have direct access to that filesystem (they log on via SSH). This filesystem needs to be replicated to another location.
The asker is developing a tool to perform the replication.
There are at least two options here.
A more conventional way to do this would be to run rsync between the two locations at regular intervals. The replicas will not always be consistent, but it is easy, and the system stays available and partition tolerant; in terms of the CAP theorem, this method chooses "A" and "P".
Another method, inspired by the popularity of Dropbox-like cloud storage and instant replication, is to watch the filesystem. That can be accomplished with inotify or FAM.
Interfaces for inotify are available in most scripting languages, including Perl, Python, and PHP. This approach trades availability for consistency: until a large file has finished replicating, it is not accessible on the other side.
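For example, here is a minimal sketch using the PECL inotify extension; the watched path and the log format are just placeholders, and inotify watches are not recursive, so a real tool would add a watch per subdirectory:
<?php
// Requires the PECL inotify extension (pecl install inotify).
$fd = inotify_init();
inotify_add_watch($fd, '/mnt/test',
    IN_CREATE | IN_DELETE | IN_MOVED_FROM | IN_MOVED_TO);
while (true) {
    // Blocks until at least one event is available.
    foreach (inotify_read($fd) as $event) {
        $type = ($event['mask'] & IN_ISDIR) ? 'dir' : 'file';
        // $event['name'] is relative to the watched path.
        error_log(sprintf('%s mask=%d %s %s',
            date('c'), $event['mask'], $type, $event['name']));
    }
}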
The interfaces for FAM are available in PHP and probably other languages. See the linked question for a discussion of different filesystem monitoring APIs.
The first option is essentially a one-liner. The second should not be too hard either (look at the Dropbox daemon sources for an example).
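For concreteness, a sketch of that one-liner wrapped in PHP so it can be driven from cron; the host name and paths are placeholders:
<?php
// Mirror /mnt/test to a remote replica; run from cron every few minutes.
// -a preserves permissions and timestamps, -z compresses, and --delete
// removes files on the replica that were deleted at the source.
exec('rsync -az --delete /mnt/test/ backup-host:/mnt/test/ 2>&1', $out, $status);
if ($status !== 0) {
    error_log("rsync failed ($status): " . implode("\n", $out));
}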
Note: Replication is a recurring topic at Serverfault.com.

Related

Connect file scan protection to my site

I am looking into adding a virus scanner to a web application I am creating that allows users to upload files. All the background functionality is complete; however, I am wary that a malicious user may upload a virus that other users then download.
I have been researching for a few months now on how to implement a scanner into my web application.
I have located various online virus scanners such as MetaScan-Online and VirusTotal.
I have read the documentation they provide, but I am still confused and unsure whether I can integrate these services into my application using their APIs.
Can I?
And if so, is there another virus scanner that enables a whole folder of files to be scanned simultaneously?
If the anti-virus force is strong in you, then you can probably implement a service class and upload the incoming files to one of the public scan services.
Keep in mind that they are limiting the accepted file size and number of files and that they don't store the scan reports forever.
MetaScan
The public API for Metascan is described here: https://www.metascan-online.com/public-api#!/
There is also a PHP checker available, but it uses v1 of their API and looks outdated. Maybe contact them to get an updated version using API v2.
https://www.metascan-online.com/apps#!/php-metascan-online-checker
VirusTotal
The public API is described here: https://www.virustotal.com/de/documentation/public-api/
There are multiple PHP libraries available; to mention just one of them:
https://github.com/jayzeng/virustotal_apiwrapper
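If you would rather not pull in a wrapper, a minimal sketch against the v2 endpoint could look like the following; the API key and file path are placeholders, and the public tier rate-limits requests:
<?php
// Submit one file to VirusTotal's public v2 API (CURLFile needs PHP 5.5+).
$ch = curl_init('https://www.virustotal.com/vtapi/v2/file/scan');
curl_setopt_array($ch, array(
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POSTFIELDS     => array(
        'apikey' => 'YOUR_API_KEY',                         // placeholder
        'file'   => new CURLFile('/tmp/upload/sample.pdf'), // placeholder
    ),
));
$response = json_decode(curl_exec($ch), true);
curl_close($ch);
// Scans are queued; fetch the report later from /file/report using the
// returned 'resource' value.
echo isset($response['scan_id']) ? $response['scan_id'] : 'submission failed';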
Local clamAV scan after upload
Another solution is simply to trigger a ClamAV scan with clamscan after a file has been uploaded to your server. That means: upload to a sandboxed AV-scan folder, scan, drop the file (if bad) or keep it (if OK), and finally move it to the upload folder. This is slow, because the virus signatures have to be loaded each time the clamscan command is invoked from PHP.
By sandbox folder I mean a restricted system environment for controlling resources, e.g. an "upload directory" with restricted or removed read/write/exec permissions (user/group/other), isolated from the application, not accessible from the web, and with restricted PHP capabilities (think of the disable_functions directive).
When your server runs the ClamAV daemon (clamd), it's possible to invoke clamdscan on the folder instead. This solution is fast, because the signatures are kept in memory.
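A minimal sketch of the clamdscan variant; the paths are assumptions, clamdscan exits 0 for clean, 1 for infected, and 2 on errors, and the clamd user must be able to read the sandbox directory:
<?php
// Scan a freshly uploaded file with the ClamAV daemon before accepting it.
$sandboxed = '/srv/upload-sandbox/' . basename($_FILES['file']['name']);
move_uploaded_file($_FILES['file']['tmp_name'], $sandboxed);
exec('clamdscan --no-summary ' . escapeshellarg($sandboxed), $out, $code);
if ($code === 0) {
    rename($sandboxed, '/srv/uploads/' . basename($sandboxed)); // clean
} else {
    unlink($sandboxed); // infected (1) or scanner error (2): reject either way
}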
You'll have to handle sending the folder's files yourself; as an external service, they won't be able to get the list of files in a folder on your server.
VirusTotal provides a public API to scan your files; it's a good start. You could implement multithreading and store the result for each file, so you avoid sending the same file multiple times.

Securely Allow PHP Read & Write Access to System Files

I have not been able to find solid information on preferred (best-practice) and/or secure methods of giving PHP access to config files, or other files on a Linux server that are outside the public web directory and not owned by the Apache user, so I'm hoping to find some answers here.
I am a fairly competent PHP programmer, but I am increasingly tasked with writing web applications (most of which, however, are not publicly accessible via the web) that require updating, changing, or adding to config files, or to files generated by some service or application on the server.
For instance, I need to create a web interface that will view, add or remove entries from a /etc/mail/spamassassin/white-list.cf file owned by root.
Another scenario is that I need php to parse mime messages in /var/vmail that are owned by user vmail.
These are just a couple examples, there will be other files in locations owned by other processes/users. How can I write PHP applications that securely access and manipulate these files without opening security risks?
If I needed to implement something like this, I would probably look at using something like sudo to fine-tune permissions. I'm not a Linux CLI expert, so I'm sure there are issues I haven't taken into account while typing this out.
I would probably determine what tasks need to be done and write a separate script for each of them. Using sudo, I'd grant the necessary level of permission to each script individually.
Obviously, as the number of tasks increases, so do the complexity and the amount of work involved. I'm not sure how much that affects you at the moment. A sketch of the idea follows below.
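A hedged sketch of that idea applied to the white-list.cf example; the helper script and the sudoers rule are assumptions, not a vetted setup:
<?php
// Assumed sudoers rule (edit with visudo):
//   www-data ALL=(root) NOPASSWD: /usr/local/bin/whitelist-add.sh
// whitelist-add.sh is a hypothetical script that validates its argument
// and appends it to /etc/mail/spamassassin/white-list.cf.
$address = $_POST['address'];
if (!filter_var($address, FILTER_VALIDATE_EMAIL)) {
    die('invalid address'); // validate in PHP too, not only in the script
}
exec('sudo /usr/local/bin/whitelist-add.sh ' . escapeshellarg($address),
    $out, $status);
echo $status === 0 ? 'added' : 'failed';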

Execute shell commands with sudo via PHP

So far my search has shown the potential security holes that will be made while trying to perform a sudo'd command from within PHP.
My current problem is that I need to run a bash script as sudo on my work web server via PHP's exec() function. We currently host a little less than 200 websites. The website that will be doing this is restricted to only be accessible from my office's IP address. Will this remove any potential security issues that come with any of the available solutions?
One way is to add the apache user to the sudoers file, but I assume this applies to the entire server, so it would still pose an issue for all the other websites.
Is there any solution that will not pose a security threat when used on a website that has access restricted to our office?
Thanks in advance.
Edit: A brief background
Here's a brief description of exactly what I'm trying to achieve. The company I work for develops websites for tourism-related businesses, amongst other things. At the moment, when creating a new website I need to set up a hosting package, which includes: creating the directory structure for the new site, creating an Apache config file which is included into httpd.conf, adding a new FTP user, and creating a new database for use with the website CMS, to name a few.
At the moment I have a bash script on the server which creates the directory structure, adds the user, creates the Apache config file, and gracefully restarts Apache. That's just one part; what I'm looking to do is use this shell script from a PHP script to automate the entire website-generation process in an easy-to-use way, for other colleagues and just general efficiency.
You have at least 4 options:
Add the apache user to the sudoers file (and restrict it to running that one command!)
In this case a security hole in one of your PHP apps could run the script too (if it can include the calling PHP, for example, or even bypass the restriction to your office IP by reaching the script through another URL, e.g. via mod_rewrite).
Flag the script with the setuid bit
Dangerous, don't do it. (On Linux the setuid bit is ignored on interpreted scripts anyway.)
Run another web server that binds only to a local interface and is not accessible from outside
This is my preferred solution, since the link calling the PHP can be reached by links from your main web server while its security is handled separately. You can even create a dedicated user for this server. Some simple server will do the job; there are server modules for Python and Perl, for example. It is not even necessary to enable exec() in your main PHP installation at all!
Run a daemon (inotify, for example, to watch file events) or a cron job that reads a file or database entry and then runs the command
This may be too complex, and it has the disadvantage that the daemon cannot check which script generated the entry. A rough sketch of this approach follows below.
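Here is that rough sketch, with a spool file as the hand-off; every path, the record format, and the worker script name are made up for illustration:
<?php
// --- Web side (the restricted site): it only queues a request and never
// runs anything privileged itself. ---
file_put_contents('/var/spool/sitegen/pending',
    json_encode(array('domain' => $_POST['domain'])) . "\n",
    FILE_APPEND | LOCK_EX);

// --- Worker side: a separate script run from root's crontab, e.g.
//   * * * * * php /usr/local/bin/sitegen-worker.php
// Renaming the spool first keeps new requests from being lost mid-run. ---
$spool = '/var/spool/sitegen/pending';
if (file_exists($spool)) {
    rename($spool, $spool . '.work');
    $lines = file($spool . '.work', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    foreach ($lines as $line) {
        $job = json_decode($line, true);
        // Re-validate here: the worker must not trust what the web app wrote.
        if ($job && preg_match('/^[a-z0-9.-]+$/', $job['domain'])) {
            exec('/usr/local/bin/create-site.sh ' . escapeshellarg($job['domain']));
        }
    }
    unlink($spool . '.work');
}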

Inter-network File Transfers using PHP with polling

I am designing a web-based file-management system that can be conceptualised as 3 different servers:
The server that hosts the system interface (built in PHP) where users 'upload' and manage files (no actual files are stored here, it's all meta).
A separate staging server where files are placed to be worked on.
A file-store where the files are stored when they are not being worked on.
All 3 servers will be *nix-based on the same internal network. Users, based in Windows, will use a web interface to create an initial entry for a file on Server 1. This file will be 'uploaded' to Server 3 either from the user's local drive (if the file doesn't currently exist anywhere on the network) or another network drive on the internal network.
My question relates to the best programmatic approach to achieve what I want to do, namely:
When a user uploads a file (selecting the source via a web form) from the network, the file should be transferred to Server 3 as an inter-network transfer, rather than passing through the user's machine (which I believe is what would happen with a standard HTTP form upload). I know I could set up FTP servers on each machine and attempt to FXP files between locations, but is that preferable to having PHP execute a command on Server 1 (which will have global network access) to perform the cross-network transfer?
The second problem is that these are very large files we're talking about, at least a gigabyte or two each, and so transfers will not be instant. I need some method of polling the status of the transfer, and returning this to the web interface so that the user knows what is going on.
Alternatively, this upload could be left to run asynchronously to the user's current view, but I would still need a method of checking the status of the transfer to ensure it completes.
So, if using an FXP solution, how could polling be achieved? If using a file move/copy command from the shell, is any form of polling possible? PHP/JQuery solutions would be very acceptable.
My final part of this question relates to Windows network drive mapping. A user may map, and select a file from, an arbitrarily specified mapped drive. Their G:\ may point to \\server4\some\location\therein, but presumably any drive path given to the server via a web form will only send the G:\ file path. Is there a way to determine the 'real path' of mapped network drives?
Any solution would be used to stage files from Server 3 to Server 2 when the files are being worked on - the emphasis being on these giant files not having to pass through the user's local machine first.
Please let me know if you have comments, and I will try to make this question more coherent if it is unclear.
As far as I’m aware (and I could be wrong) there is no standard way to determine the UNC path of a mapped drive from a browser.
The only way to do this would be to have some kind of control within the web page. Could be ActiveX or maybe flash. I’ve seen ActiveX doing this, but not flash.
In the past when designing web based systems that need to know the UNC path of a user’s mapped drive I’ve had to have a translation of drive to UNC path stored server side. I did have a luxury though of knowing which drive would map to what UNC path. If the user can set arbitrary paths then this obviously won’t work.
Ok, as I’m procrastinating and avoiding real work I’ve given this some thought.
I’ll preface this by saying that I’m in no way a Linux expert and the system I’m about to describe has just been thought up off the top of my head and is not something you’d want to put into any kind of production. However, it might help you down the right path.
So, you have 3 servers, the Interface Server (LAMP stack I’m assuming?) your Staging Server and your File Store Server. You will also have Client Machines and Network Shares. For the purpose of this design your Network Shares are hosted on nix boxes that your File Store can scp from.
You’d create your frontend website that tracks and stores information about files etc. This will also hold the details about which files are being copied, which are in Staging and so on.
You'll also need some kind of service running on the File Store Server. I'll call this the File Copy Service. This will be responsible for copying the files from the servers hosting the network shares.
Now, you’ve still got an issue with how you figure out what path the users file is actually on. If you can stop users from mapping their own drives and force them to use consistent drive letters then you could keep a translation of drive letter to UNC path on the server. If you can’t, well I’ll let you figure that out. If you’re in a windows domain you can force the drive mappings using Group Policies.
Anyway, the process for the system would work something like this.
User goes to system and selects a file
The Interface Server takes the file path and calls the File Copy Service on the File Store Server
The File Copy Service connects to the server that hosts the file and initiates the copy. If they're all *nix boxes you could easily use something like SCP. One caveat: scp only draws its progress meter when attached to a terminal, so from a script rsync's --progress output is easier to parse. Either way, with a running percentage the File Copy Service can keep updating the database on the Interface Server, so the user can follow the copy from there.
The File Copy Service can also be used to move files from the File Store to the staging server.
As I said, this is very roughly thought out. The above would work, but a lot depends on how your systems are set up.
Having said all that though, there must be software that would do this out there. Have you looked?
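If it helps, here is a rough PHP sketch of that copy step using rsync's --progress output; the host, file names, and the status file are placeholders:
<?php
// Copy one file from the share host to the file store, recording the
// percentage so the interface server can poll it.
$cmd = 'rsync --progress -e ssh sharehost:/exports/big_file.mov /store/ 2>&1';
$pipe = popen($cmd, 'r');
while (!feof($pipe)) {
    // rsync redraws its progress line with \r rather than \n, so split on \r.
    $chunk = stream_get_line($pipe, 4096, "\r");
    if (preg_match('/(\d+)%/', $chunk, $m)) {
        file_put_contents('/var/run/copyjob-42.status', $m[1]); // placeholder
    }
}
pclose($pipe);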
If I am right, this is the architecture:
1.)
First, let's solve the issue of inter-server transfer.
I would solve it by mounting the filesystems of Servers 2 and 3 on Server 1 via NFS.
https://help.ubuntu.com/8.04/serverguide/network-file-system.html
That way PHP can store files directly on the file system and doesn't need to know which server a file really lives on.
/etc/exports
on Servers 2 + 3 (note: no space between the client address and the options, otherwise the options apply to the whole world):
/directory/with/files 192.168.IPofServer.1(rw,sync)
exportfs -ra
/etc/fstab
on Server 1 (each export needs its own mount point):
192.168.IPofServer.2:/var/lib/data/server2/ /directory/with/files/server2 nfs rsize=8192,wsize=8192,timeo=14,intr
192.168.IPofServer.3:/var/lib/data/server3/ /directory/with/files/server3 nfs rsize=8192,wsize=8192,timeo=14,intr
mount -a
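With the mounts in place, the PHP side is an ordinary local write; the server3 subdirectory here is an invented layout:
<?php
// On Server 1 the upload is an ordinary local write; the bytes actually
// land on Server 3 via the NFS mount.
$target = '/directory/with/files/server3/' . basename($_FILES['file']['name']);
if (move_uploaded_file($_FILES['file']['tmp_name'], $target)) {
    echo 'stored';
}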
2.)
Getting upload progress for really large files.
Here are some possibilities for showing a progress bar for HTTP uploads; for a resume function, though, you would have to use a Flash plugin.
http://fineuploader.com/#demo
https://github.com/valums/file-uploader
Or you can build it yourself using the APC extension:
http://www.amwsites.com/blog/2011/01/use-a-combination-of-jquery-php-apc-uploadprogress-to-show-progress-bar-during-an-upload/
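For reference, the APC approach boils down to something like this sketch; it assumes apc.rfc1867 = 1 in php.ini and a hidden APC_UPLOAD_PROGRESS field placed before the file input in the upload form:
<?php
// progress.php, polled by jQuery. Returns the upload's percent complete.
$status = apc_fetch('upload_' . $_GET['key']);
if ($status !== false && $status['total'] > 0) {
    echo json_encode(array('percent' =>
        round($status['current'] / $status['total'] * 100)));
} else {
    echo json_encode(array('percent' => 0));
}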
3.)
Letting the server load files from a network drive.
I would try a Java applet to figure out the real network path and send it to the server, so the server can fetch the file in the background.
But I have never done anything like this before and have no further information.

Is it possible to create ftp users and assign them access to select folders using php?

I just needed to know whether it is possible in PHP to create an FTP user, then create folders on the server and grant the new FTP user access to selected folders.
Thanks again!
Native PHP cannot do this; the task is way outside PHP's scope.
Depending on the server OS and FTP server software used, however, PHP could call some shell scripts (or WMI / PowerShell scripts on Windows) that accomplish the task. This is not trivial to set up, though, especially not if it's to be done safely (without giving the PHP process root level privileges).
The question may be better suited on Serverfault.com.
There are a few web hosting control panels written in PHP that create FTP accounts among other things, so it's definitely possible.
The exact procedure depends completely on the FTP server you use. It may involve creating new Unix user accounts.
This is more an FTP or operating system question than a PHP question though as you need to shell out to do the configuration. As Pekka said you may have more luck asking on Serverfault if you include the details of your setup.
No native way, but if I'm not mistaken you could do something like this:
Create a shell script (ftp.sh) that creates the users, sets the permissions, etc. The classic suggestion is to give it the SUID bit (owned by root, readable/writable only by root), but note that Linux ignores the setuid bit on interpreted scripts, so in practice you would need a sudo rule or a compiled wrapper instead.
Call the script from PHP:
system("./ftp.sh ".escapeshellarg($newUsername)." ".escapeshellarg($newPassword));
However, I'm pretty sure there are more secure/correct ways of doing this; I can definitely see it becoming a security nightmare.
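For what it's worth, a hedged sketch of the sudo-rule variant; the helper script name, its arguments, and the folder layout are all invented:
<?php
// Assumed sudoers rule (edit with visudo):
//   www-data ALL=(root) NOPASSWD: /usr/local/bin/ftp-adduser.sh
// ftp-adduser.sh is a hypothetical script that would run useradd, point the
// home directory at the selected folder, and set the permissions.
$user = $_POST['username'];
if (!preg_match('/^[a-z][a-z0-9_-]{2,15}$/', $user)) {
    die('invalid username');
}
exec('sudo /usr/local/bin/ftp-adduser.sh ' . escapeshellarg($user) . ' '
    . escapeshellarg('/srv/ftp/' . $user), $out, $status);
echo $status === 0 ? 'account created' : 'failed';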
The answer is "yes" if the process the script runs under is allowed to change the FTP settings (e.g. adding users and groups), whether through native PHP functions or an additional shell script, and "no" if the process has neither the access nor the privileges to make those changes.
