How to compare different files from 2 servers with rsync? [closed]

I've been having trouble with hackers: they broke into our website, so I cloned the server to a new one and then reset the GitHub repository on that server.
I've heard that rsync can find out what is different between 2 servers. Is that possible with rsync? Can I export that list of files to a text file?
Thanks for your help. We got hacked because we were running an old version of WordPress >_< It has been driving me crazy for the last couple of days.

The comment by Marty is good: the rsync command as written will do a dry run (-n) and show you which files were added, deleted or changed between the $TARGET and $SOURCE locations, so you can then inspect or diff them to see if there is any malicious code.
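A minimal sketch of such a dry run, assuming the old (hacked) server is reachable over SSH and the web root is /var/www on both machines (the hostname and paths below are placeholders for the comment's $SOURCE and $TARGET):

# -a preserve metadata, -n dry run (nothing is transferred), -i print an itemized
# change summary for every differing file, --delete also report files that only
# exist on the destination. Redirect the output to a text file for later review.
rsync -ani --delete user@old-server:/var/www/ /var/www/ > rsync-diff.txt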
Additionally, when I've dealt with hacked WordPress installs in the past, it has been important to find the exploitation vector. Often, PHP shells get uploaded via some insecure script or plugin, giving the attacker a command-based web shell to view files, run commands, etc.
To find these files, the following command is helpful:
grep -R -E '((shell_)?exec|system|eval)' /path/to/wordpress/
This might yield something like:
wpte.php: eval($_POST['p1']);
In this case, wpte.php was a malicious PHP shell script that had been uploaded to one of my clients' servers and was then used to run commands and upload more files. These scripts usually use one or more functions such as eval or shell_exec to run whatever the attacker submits through the web shell.
From there, you can check the server access logs for hits to the malicious script, then search the logs for the IP address(es) that accessed it to find out how the script was uploaded and what other resources those addresses requested.
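A minimal sketch of that log search, assuming an Apache access log at /var/log/apache2/access.log and the wpte.php name from above (adjust the path and filename to your setup):

# All requests to the malicious script, including the client IPs that made them.
grep 'wpte.php' /var/log/apache2/access.log
# Just the offending IP addresses (first field of the combined log format)...
grep 'wpte.php' /var/log/apache2/access.log | awk '{print $1}' | sort -u
# ...which you can feed back into grep to see everything else those IPs requested.
grep -F -f <(grep 'wpte.php' /var/log/apache2/access.log | awk '{print $1}' | sort -u) /var/log/apache2/access.log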

Related

How to run exec() from browser [closed]

Hello, I'm new to both PHP and Python, but I'm using both in my graduation project. I have an OTP script that generates a random number and sends an SMS to phone numbers, and another script that runs a fingerprint scanner. Both work fine when I use exec() in PHP to run them from the command line, but from the browser only the OTP script works with the web page. I've changed the permissions of the Apache user, given it root privileges, and tried both the system() and exec() functions, but still nothing shows up when I try to execute the fingerprint code from the browser.
I would appreciate any help because I have been stuck on this for days now.
I've changed the permissions of the Apache user and given it root privileges
OMG no.
If your supervisor (who is being paid to give you advice) thinks this is a good idea, then find another supervisor.
You have not provided nearly enough information to form an opinion on the cause. You need to investigate which file permissions are relevant to the problem - that is, Unix filesystem permissions on executables, devices and data files - but you also need to look at any mandatory access control systems that might be in play (SELinux, AppArmor, Smack). You should also try running the programs from an interactive shell as the Apache uid. Note that it is usually a good idea to severely restrict the programs the web server can run and to whitelist specific actions for the web server uid via sudo.
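A minimal sketch of those checks, assuming the Apache user is www-data and the fingerprint program lives at /opt/fingerprint/scan.py (both names are placeholders):

# Run the program interactively as the web server user to see the real error output.
sudo -u www-data /usr/bin/python3 /opt/fingerprint/scan.py
# Check permissions on the executable and on the device node it talks to.
ls -l /opt/fingerprint/scan.py /dev/hidraw0
# If SELinux is enforcing, recent denials show up in the audit log.
sudo ausearch -m avc -ts recent
# Instead of giving Apache root: whitelist exactly one command in a sudoers drop-in
# (e.g. /etc/sudoers.d/fingerprint) and call it with sudo from PHP:
# www-data ALL=(root) NOPASSWD: /opt/fingerprint/scan.py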

How to copy directory from a public http url to my Server [closed]

I have some files in a directory and its subdirectories on an open HTTP site.
For Example:
http://example.com/directory/file1
http://example.com/directory/file2
http://example.com/directory/sub-directory/file1
http://example.com/directory/sub-directory/file2
http://example.com/directory/sub-directory2/file1
http://example.com/directory/sub-directory2/file2
I want to copy the full directory to my server.
I don't have SSH or FTP access to the http://example.com
I have tried a transloader script, but it grabs only one file at a time.
I need to copy the full directory exactly as is on the HTTP server to my new server.
Thanks
Use wget or curl:
wget -r --no-parent http://example.com/directory/
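If the server has directory indexes enabled for http://example.com/directory/, a slightly fuller variant looks like this (the flag choices are one reasonable setup, not the only one):

# -r recurse, -np never ascend to the parent directory, -nH drop the hostname
# directory level, -R skip the auto-generated index pages created while crawling.
wget -r -np -nH -R "index.html*" http://example.com/directory/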
You are unable to do this. You can grab the content of the visual layer/GUI that the site provides to you, but you cannot grab any of the "behind the scenes" pages the site has. You won't be able to get any of the code that does the back-end processing to create what you see on the front end.
The only way to do this is if you have access to the directory listings on the site. By that, I mean that when you go to a base directory, such as example.com/test/, it just gives a list of all the files in that directory. As it stands, though, most sites protect against this, so unless you have direct access, it is not doable, as it would be entirely insecure and would create many headaches for development and privacy.

Upgrading PHP to latest version [closed]

I've been using PHP 5.4.3 for about a month now, and today I decided to switch to PHP 5.5.5. I downloaded the source code and placed it in C:/php (renaming the folder from php-5.5.5 to php), and I added the server variable C:/php/ as usual - but here I got stuck. Usually I appended php.exe, which was found inside the PHP folder, so that I could access PHP from the command line or start the built-in server. Now I can't find this file, and I can't find a way to start the server from the command line either.
You can compile PHP yourself. It is only a handful of commands and the process is documented: you run configure and then make install. You can run configure --help to see all the options, which is useful if you want to build for FastCGI, need a smaller footprint, or want a faster PHP.
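A minimal sketch of that build on a Unix-like system (on Windows the prebuilt binaries mentioned below are usually the easier route; the prefix and options are only examples):

# From the unpacked PHP source directory:
./configure --prefix=/usr/local/php --enable-fpm    # ./configure --help lists all options
make
sudo make install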
The source code you downloaded needs to be built...
You can download a binary version from here (assuming you're using Windows, of course).
Or read this if you want to build it yourself. There are many other tutorials online.

Malware infected website [closed]

I have the following problem:
The website I made for a friend has been infected with malware. I tried to clean it by replacing the files that differed from the ones I uploaded with my original copies, but after a short time the files were modified again. The file permissions are all 644 and the folders 755. It is as if whoever infected the files has access to change them back whenever I restore them. Can anyone help me, since I am very new to this kind of problem?
First things first: report this to your webhost immediately! Secondly change all of your relevant passwords!
That being done, there are a few possible causes:
Your parent webhost has been compromised, in which case there is nothing you can do except move to a better host.
Your website contains a vulnerability that is being picked-up by kiddies with their vuln-scanners. Be sure to audit your code to ensure that no user action can result in your website's filesystem being touched inappropriately; also check for SQL injection avenues.
Your website uses a widely-distributed application, such as WordPress, that has not been patched - this is a major problem.
Your own PC has been compromised and ne'er-do-wells have used a keylogger or other software to discover your FTP or SSH account details, and are abusing your website. Run a local scan and audit everything to make sure your bank account isn't being raided either.
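While you audit, a quick way to spot files the attacker keeps touching is to list recent modifications under the web root, for example (the path is an assumption):

# PHP files changed in the last 2 days, newest first.
find /var/www -type f -name '*.php' -mtime -2 -printf '%T@ %p\n' | sort -rn | cut -d' ' -f2-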
This isn't a code-related problem. This isn't the place for your question.
But: It's likely that a program is running on your server and re-infecting the files. I'd recommend either taking it to a professional malware removal service, or (my preference) burning the server in a fire and allowing a new server to rise from the ashes. Then install an AV suite on the new server.

Linux server backup with PHP? [closed]

I manage a client's VPS. He wants a backup solution that includes some folders and MySQL databases. The OS is Ubuntu and the web server is Apache. I don't want my client to have to mess with SSH or FTP.
I think I can save database backup files and trigger PHP's exec function from a web page to zip the folders and database dumps, then give him a link to download the zip file.
This is technically possible, but I wonder if there is a better solution, other than automatically copying backup files to another server, because in my situation backups need to be created on demand at any time.
There are lots of possibilities; here is a very simple one we use every day (sketched in code after the list):
create a backup script (e.g. in bash) with the usual suspects such as mysqldump, tar and date
make sure this backup script locks against double runs
create a cron job, that runs every minute, checks if a flagfile exists, and if yes starts the backup script and then clears the flagfile
if you want, create more cron jobs (e.g. a daily one), that do nothing but set the flagfile
create a trivial PHP script, that just touches the flagfile to trigger an adhoc backup
You can download the finished backup package, once the flagfile is cleared (again check via a trivial PHP script)
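A minimal sketch of the script and the cron wiring, assuming hypothetical paths (/usr/local/bin/site-backup.sh, /var/run/site-backup.lock, /var/run/backup.flag, /var/backups) and a database called mydb; the trivial PHP pages simply touch and check the flagfile:

#!/bin/bash
# /usr/local/bin/site-backup.sh - dump the database and archive the web root.
set -euo pipefail
STAMP=$(date +%F-%H%M)

# Lock against double runs.
exec 9>/var/run/site-backup.lock
flock -n 9 || exit 1

# MySQL credentials are expected in ~/.my.cnf so nothing sensitive ends up in cron.
mysqldump --single-transaction mydb > "/var/backups/mydb-$STAMP.sql"
tar -czf "/var/backups/site-$STAMP.tar.gz" /var/www "/var/backups/mydb-$STAMP.sql"

# Cron entry (every minute): run the backup only when the flagfile exists, then clear it.
# * * * * * [ -f /var/run/backup.flag ] && /usr/local/bin/site-backup.sh && rm -f /var/run/backup.flag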
