How to find out which code is using Sendmail?

Some unknown code (a virus?) has suddenly started sending out thousands of mails from our server, and though we think we have removed the corresponding malicious PHP file, the mails are still getting sent out.
How can I find out which code is sending out the mails? I tried looking under /var/log/maillog but found no pointers there. Is there any other way of finding out?
We are using the CentOS distro.

The fact that you have removed the corresponding PHP file doesn't mean that the file didn't manage to make copies of itself elsewhere on your system. If these emails are being sent continuously, i.e. this was not a single occurrence, then it is possible that the script has somehow infiltrated your crontab files and is calling itself periodically.
Take a look at the crontab file for each user (including root) on your system. Make sure to inspect any script the crontab executes, no matter how innocent it looks.
Another possibility is a .htaccess file that executes a certain script when a specific URL is requested; the execution of a script can easily be hidden this way. Inspect all .htaccess files for strange rules that you have no record of adding.
Hopefully one or more of these options will shed some light on where these emails are being sent from...
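One additional way to hunt for copies is to sweep the document root for the usual injected-mailer signatures. A minimal sketch (the webroot path and the patterns are assumptions; adjust them to your server):

<?php
// Recursively look for PHP files containing patterns injected mailers commonly use.
$docroot  = '/var/www/html';                                   // assumed webroot
$patterns = '/\b(mail\s*\(|eval\s*\(\s*base64_decode)/i';      // signatures to flag

$it = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($docroot));
foreach ($it as $file) {
    if ($file->isFile() && preg_match('/\.php$/i', $file->getFilename())) {
        $source = file_get_contents($file->getPathname());
        if (preg_match($patterns, $source)) {
            echo $file->getPathname(), "\n";                   // candidate for manual inspection
        }
    }
}

On PHP 5.3 and later you can also set mail.add_x_header = On and mail.log (pointing at a log file of your choosing) in php.ini, which records which script triggered each mail() call.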

The malicious code was in the WordPress DB and in associated PHP files; every time the site was loaded, it was invoked through a call to the header PHP file. We cleaned the site and a fresh install fixed the problem. I had already checked the crontab and there was no infected code there. Thanks for all the pointers.

Related

PHP - Multiple scripts at once (AJAX)

After asking this question, someone pointed me in the right direction: a second script cannot be executed at all while the first one is still running.
I usually make apps that rely on AJAX calls to PHP pages, and today I found that writing to a file with fwrite() in one PHP script while trying to read that same file with fread() (to get progress feedback) in another AJAX call ended up with the second script only being executed once the first one had finished.
Even trying to echo a simple "hello" (echo "hello"; exit;) would not show anything on the page until the first script had finished.
So, I'm asking: is this a normal configuration? Is this the default on every installation of PHP? Is there some setting in php.ini that I can change?
Or does it have to do with the server (in my case, Microsoft IIS 10)? Can someone shed some light on how to execute multiple PHP scripts in different AJAX calls at once (or before the others finish)?
I know I'm not giving much information about my setup, but I don't even know where to look.
Thank you everyone for your time and help!
As Luis said, it could be a write lock on the file you're trying to modify. However, another possibility, if you're using file-based sessions (rather than database-backed ones) or a framework that uses file-based sessions, is session locking: PHP locks the session file for the duration of a request, so a second request using the same session has to wait until the first one releases it. My money would be on Luis' answer though - you should probably be using a database rather than a file unless you have a solid reason not to.
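If it does turn out to be session locking, the usual workaround is to release the lock as soon as the long-running script no longer needs to write to the session; a minimal sketch:

<?php
// Long-running script (e.g. the one writing the file with fwrite()).
session_start();
$_SESSION['job'] = 'running';   // do any session writes first
session_write_close();          // release the session lock so other AJAX calls can proceed

// ... long-running work here; the progress-polling script is no longer blocked ...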

Routing .htaccess to GitHub

I was wondering if there was a way to basically host a site on your server so you can run PHP, but have the actual code hosted on GitHub. In other words...
If an HTTP request went to:
http://mysite.com/docs.html
It'd request and pull in the content (via file_get_contents() or something):
https://raw.github.com/OscarGodson/Core.js/master/docs.html
Or, if they went to:
http://mysite.com/somedir/another/core.js
It'd pull down:
https://raw.github.com/OscarGodson/Core.js/master/somedir/another/core.js
I know GitHub has their own DNS servers, but I'd rather host it on my own server so I can run server-side code. What would the .htaccess code look like for this?
If the requirement is to run the PHP embedded in the HTML stored on github.com on the server at mysite.com simply via a configuration line like a redirect in the .htaccess file, this is beyond the capabilities of .htaccess files.
A .htaccess file is typically used to provide directives to the Apache web server. These directives can indicate, for example, access permissions, popup password protection, linkages between URLs and the server's file system, handlers for certain types of files when fetched by the server before delivery to the browser, and redirects from one URL to another URL.
An .htaccess file can issue redirects for http://mysite.com/somedir/another/core.js to https://raw.github.com..., but then the browser will be pointed at raw.github.com, not mysite.com. Tricks can be done with frames to make this redirection less visible to the human at the browser, but these don't change the fact that the data comes from github.com without ever going through the server at mysite.com.
In particular, PHP tags embedded in the HTML on github.com are never received by mysite.com's server and therefore will not run. Probably not what you want. Unless some big changes have occurred in Apache, .htaccess files will not set up that workflow. It might be possible for an expert to write an Apache module to do it, but I am not sure.
What you can do is put a cron job on mysite.com that runs git pull from github.com every few minutes. Perhaps that is what you want to do instead?
If the server can run PHP code, you can do this.
Basically, in the .htaccess file you use a RewriteRule to send all paths to a PHP script on your server. For example, a request for /somedir/anotherdir/core.js becomes /my-script.php/somedir/anotherdir/core.js. This is how a lot of app frameworks operate. When my-script.php runs the "real" path is in the PATH_INFO variable.
From that point the script could then fetch the file from GitHub. If it was HTML or JavaScript or an image, it could just pass it along to the client. (To do things properly, though, you'll want to pass along all the right headers, too, like ETag and Last-Modified and then also check those files, so that caching works properly and you don't spend a lot of time transferring files that don't need to be transferred again and again. Otherwise your site will be really slow.)
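Assuming an .htaccess rule along the lines of RewriteRule ^(.*)$ /my-script.php/$1 [L] (plus a condition so my-script.php itself isn't rewritten), a minimal sketch of the fetch-and-pass-through script could look like this; the script name and the content-type handling are placeholders, and proper caching/header forwarding is left out:

<?php
// my-script.php - hypothetical front controller sketch; adjust to taste.
$base = 'https://raw.github.com/OscarGodson/Core.js/master';   // repo from the question
$path = isset($_SERVER['PATH_INFO']) ? $_SERVER['PATH_INFO'] : '/';

$content = @file_get_contents($base . $path);                  // requires allow_url_fopen
if ($content === false) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

// Naive content-type guess; a real version would also forward ETag/Last-Modified.
if (substr($path, -3) === '.js') {
    header('Content-Type: application/javascript');
}
echo $content;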
If the file is a PHP file, you could download it locally, then include it into the script in order to execute it. In this case, though, you need to make sure that every PHP file is self-contained, because you don't know which files have been fetched from GitHub yet, so if one file includes another you need to make sure the files dependent on the first file are downloaded, too. And the files dependent on those files, also.
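A rough sketch of that PHP branch, continuing the script above (the cache directory is hypothetical, and keep in mind this executes whatever code is currently in the GitHub repo):

<?php
// If the requested path is a PHP file, cache it locally and include it so it
// actually executes on mysite.com rather than being sent as plain text.
if (substr($path, -4) === '.php') {
    $cacheDir = '/tmp/gh-cache';
    if (!is_dir($cacheDir)) {
        mkdir($cacheDir, 0700, true);
    }
    $cached = $cacheDir . '/' . md5($path) . '.php';
    file_put_contents($cached, file_get_contents($base . $path));
    include $cached;    // runs the fetched code
    exit;
}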
So, in short, the .htaccess part of this is really simple, it's just a single RewriteRule. The complexity is in the PHP script that fetches files from GitHub. And if you just do the simplest thing possible, your site might not work, or it will work but really painfully slowly. And if you do a ton of genius level work on that script, you could make it run OK.
Now, what is the goal here? To save yourself the trouble of logging into the server and typing git pull to update the server files? I hope I've convinced you that trying to fetch files on demand from GitHub will be even more trouble than that.

Using php and MPI

I currently have a php file which allows the user to upload a file. Once they upload the file, it runs a program with the file using MPI.
The problem is that the script says it cannot find the file .mpd.conf (a config file that must be present in the user's home directory). I'm guessing that this is because it is running as a different user than myself.
I am using apache2 to serve this webpage, can anyone help me get this working? I don't know too much about how PHP works.
Although the user can set a lot of things in their .mpd.conf, the reason it's required is just to have a 'secret word' that the launched mpds can agree on - like (say) Erlang machine cookies, it's just so that the various mpd daemons that get launched can make sure they're only contacting the right other mpds.
Presumably your PHP program is launching a script which does the mpirun/mpiexec? If so, you could simply have the script check for the existence of ~/.mpd.conf and, if it doesn't exist, create it containing a line of the form MPD_SECRETWORD=[something-unique-here], and then make sure it's created with read/write permissions only for that user.
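Since the launch already goes through PHP, the check could also be done there, right before calling mpirun; a minimal sketch (the home-directory fallback and the secret word are placeholders):

<?php
// Make sure the launching user's ~/.mpd.conf exists before starting MPI.
$home = getenv('HOME') ?: '/var/www';            // home of the user Apache runs as (assumption)
$conf = $home . '/.mpd.conf';

if (!file_exists($conf)) {
    file_put_contents($conf, "MPD_SECRETWORD=change-me-to-something-unique\n");
    chmod($conf, 0600);                          // mpd expects owner-only permissions
}

// then launch as before, e.g.:
// exec('mpirun -np 4 /path/to/program ' . escapeshellarg($uploadedFile), $output);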

All PHP files getting hacked

Like always, just want to say thank you for all of the help and input in advance.
I have a particular site that I am the web developer for and am running into a unique problem. It seems that somehow something is getting into every single PHP file on my site and adding some malware code. I have deleted the code from every page multiple times and changed FTP and DB passwords, but to no avail.
The code that is added looks like this - eval(base64_decode(string)) - where the string is 3024 characters long.
Not sure if anyone else has run into this problem or if anyone has ideas on how I can lock down my PHP code.
Thanks again.
The server itself could be compromised. Report the problem to your web host. What is their response?
An insecure PHP script coupled with incorrect file permissions could give the attacker the ability to modify your PHP files. To eliminate this possibility I would take the site down, delete all the files, re-upload, then switch permissions on the entire site to deny any writes to the file system.
Edit:
As a short-term fix try asking your web host to disable eval() for your account. If they're worth their salt they should be running Suhosin which has an option to disable eval.
You should use "disable_functions=eval,exec" in your php.ini or .htaccess as first measure.
Yes, I have run into this problem myself. I take it you are on a shared host? Are you perchance on Rackspace Cloud?
That is where I had this problem. The first thing you need to do right away is notify your host; this is a hosting issue, and I suspect the malware has gained access to your server at the FTP level.
Make sure you have nothing chmod 777 (world-writable); if something needs to be writable by your app, make it 775.
Hope this helps, good luck.
You should change the file permissions so that only you can write to those files. 0777 (the default on some hosts, I believe) is just asking for trouble. See File Permissions.
Also, it's advisable to keep any files that aren't supposed to be accessible by URL, for example config files, outside of the public_html folder.
I had a similar problem. However, in my case I was running a Python code evaluator on my site; as far as I remember it relied on an eval()-style call to execute the submitted code. In one of my PHP files I found a weird eval statement. What kind of script are you developing? I mean, does it involve evaluating some other code?
You should also note (assuming you are using a hosting provider) that it is often not your own code's fault. For example, the hosting company Network Solutions recently had a server hacked and over 1,000 web pages were affected, not due to security holes in each particular site, but due to bad configuration and monitoring of the shared server that hosts those sites. If you can't see anything wrong security-wise with your code, i.e. you sanitize everything properly and you are running a non-vulnerable version of whatever CMS you use (if you use one), then it's probably not an issue with your site, just with the server in general.
You should move to another server. It would appear that the attacker has access to the server or is running some code as a background process which is overwriting the files. It may be possible to identify and remove the problem, but smart attackers will hide additional scripts etc to trip you up later.
I've come across viruses that read FileZilla config files (which store saved FTP passwords in plain text). I swear, at first I was just amazed, then I realized how sneaky that is.
Check your PC for viruses.
One possible scenario is that somebody managed to get write access somehow; changing the passwords etc. helped, but they left behind a PHP file that can still run.
See if there are any unknown files there, or delete everything and restore from backups.
Get the last modified time of your files, then go over to your access logs (FTP, HTTP whatever's open, if you don't know where they are ask your host) and find out who was mucking around on your system at that time.
Likely the attacker has installed a script that they can call periodically to re-infect any files you fix.
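A small sketch for gathering those timestamps in one pass (the docroot and the 7-day window are arbitrary assumptions):

<?php
// List PHP files modified in the last 7 days, newest first, to match against the logs.
$docroot = '/var/www/html';                      // adjust to your site
$cutoff  = time() - 7 * 24 * 3600;
$hits    = array();

$it = new RecursiveIteratorIterator(new RecursiveDirectoryIterator($docroot));
foreach ($it as $file) {
    if ($file->isFile() && substr($file->getFilename(), -4) === '.php' && $file->getMTime() >= $cutoff) {
        $hits[$file->getPathname()] = $file->getMTime();
    }
}
arsort($hits);
foreach ($hits as $path => $mtime) {
    echo date('Y-m-d H:i:s', $mtime), "  ", $path, "\n";
}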

I'm trying to run some PHP scripts as CLI instead of over HTTP. How do I make them play nice?

I'm using some PHP scripts from FeedForAll to join together RSS feeds (RSSmesh) and display them as HTML (RSS2HTML).
Because I intend to run these scripts fairly intensively and don't want the resulting HTTP requests and bandwidth to count towards my hosting quota, I am in the process of moving to running them on the web host's server from an umbrella PHP "batch" script, calling that script via cron (this is a Linux server, by the way).
Here's a (working) sample request over HTTP:
http://www.mydomain.com/a/rss2htmlcore/rss2html2.php?XMLFILE=http://www.mydomain.com/a/myapp/xmlcache/feed.xml&TEMPLATE=template.html
This will produce the desired HTML output. An example of how I want this to work on the command line:
/srv/customers/mycustomer#/mydomain.com/www/a/rss2htmlcore/rss2html2-cli.php /srv/customers/mycustomer#/mydomain.com/www/a/myapp/xmlcache/feed.xml /srv/customers/mycustomer#/mydomain.com/www/a/template.html
This is with the correct shebang line added to "rss2html2-cli.php". I could just as well specify the executable ("/usr/local/bin/php") in the request; I doubt it makes a difference, because I am able to run another script (that I wrote myself) either way without problems.
Now, RSS2HTML and RSSmesh are different in that, for starters, they include secondary files -- for example, both include an XML parser script -- and I suspect that this is where I am getting a bit in over my head.
Right now I'm calling exec() from the "umbrella" batch script, like so:
exec("/srv/customers/mycustomer#/mydomain.com/www/a/rss2htmlcore/rss2html2-cli.php /srv/customers/mycustomer#/mydomain.com/www/a/myapp/xmlcache/feed.xml /srv/customers/mycustomer#/mydomain.com/www/a/template.html", $output)
But no output is being produced. What's the best way to go about this and what "gotchas" should I keep in mind? Is exec() the right way to approach this? It works fine for the other (simple) script but that writes its own output. For this I want to get the output and write it to a file from within the umbrella script if possible. I've also tried output buffering but to no avail.
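For reference, one way that capture-and-write step is commonly done (a sketch; the interpreter path is the one mentioned above, the other paths are placeholders for the full ones in the exec() call):

<?php
// Call the PHP binary explicitly, capture stderr too, check the exit code,
// and write the captured output to a file.
$php      = '/usr/local/bin/php';
$script   = '/full/path/to/rss2html2-cli.php';
$feed     = '/full/path/to/feed.xml';
$template = '/full/path/to/template.html';

$cmd = escapeshellarg($php) . ' ' . escapeshellarg($script) . ' '
     . escapeshellarg($feed) . ' ' . escapeshellarg($template) . ' 2>&1';

exec($cmd, $output, $exitCode);
file_put_contents('/full/path/to/output.html', implode("\n", $output) . "\n");

if ($exitCode !== 0) {
    error_log("rss2html2-cli.php exited with code $exitCode");
}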
Do I need to pay attention to anything specific with regard to the includes? Right now they're specified in the scripts as include_once("FeedForAll_XMLParser.inc.php"); and the specified files are indeed in the same folder.
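If relative includes or data files stop resolving under cron, it is usually because the working directory differs; a common fix is to pin paths to the script's own directory at the top of the CLI script (a sketch, assuming PHP 5.3+ for __DIR__):

<?php
// At the top of rss2html2-cli.php: make relative paths resolve against the
// script's own directory instead of cron's working directory.
chdir(__DIR__);
include_once __DIR__ . '/FeedForAll_XMLParser.inc.php';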
Further info:
-This is a Linux server.
-I have no direct access to the shell, so I can't test things directly on a command line; everything goes via crontab.
-I will admit that support for the FeedForAll scripts leaves a lot to be desired, but I'd like to keep using their scripts if at all possible, if only because I know them and have been using them for a while. I have looked into Simplepie, but the FFA scripts do some things that I've seen no obvious solutions for with Simplepie, like limiting the number of items per individual feed (RSSmesh) or limiting the description length (RSS2HTML).
-Yahoo! Pipes is out, they cache their data for too long for my application.
Should you want to take a look at the code, here are the scripts as txt files. RSS2HTML2 and RSSmesh are the FeedForAll scripts, FeedForAll_XMLParser... is the included parser. Note that I have not yet amended these to handle $argv etc. I have however in "scraper-universal-rss-cli", which works fine with CLI.
If anyone has any thoughts to share on this it would be very much appreciated. Thank you in advance.
I think the $hideErrors = 0; line in rss2html is not helping. Since isset() is used to check whether errors should be displayed, you should comment this line out rather than set it to zero: a variable set to 0 still counts as set for isset(), so the errors stay suppressed.
Re-run and see if it throws up some errors for you.
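For illustration, a standalone snippet (not the rss2html code itself) showing why setting the variable to 0 isn't enough:

<?php
$hideErrors = 0;
var_dump(isset($hideErrors));   // bool(true)  - 0 still counts as "set"
unset($hideErrors);             // commenting out the assignment has the same effect
var_dump(isset($hideErrors));   // bool(false)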
Use wget or curl to issue the request against the local web server. Don't use CLI.
