Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 4 years ago.
I have a management system built in PHP that will run locally on multiple computers. I want to hide the source code from the user of the system. I do not want to share the code but only the application.
The user can view and use the application, and can also make changes, but I don't want the user to get access to the PHP files or any other project files, as they are placed in the htdocs (XAMPP) or www (WAMP) folder.
So far my searching has only turned up some encoders/obfuscators:
ionCube
Zend Guard
PHP Obfuscator
I would suggest mounting this on a central webserver instead and asking the users to access it over a network connection in their browser, rather than running it directly from localhost. This way the code is isolated on the webserver, and as long as you don't "share" the root folder on the network, they won't be able to see the source files in any way.
If you must encrypt the code, then you've already seen the two most popular ways of encoding: XAMPP can be configured to use ionCube and Zend Guard. Just ensure that you encode it in a way that can be decoded on the client machine (you can encode for differing PHP versions).
As @Edmondscommerce stated, I think the most viable option is to host it externally. This could be on a local network, so that you don't have to host it online. There are many disadvantages to running and storing your files locally (each client will have to run a webserver, updates will be disastrous, and you rightly have security concerns).
That said, if it has to run locally, there are still some ways to hide the contents of the files, depending on the user(s) of the system. If the user is not an admin and you do have those rights, you could make the server files inaccessible using the local OS's methods (i.e. revoke read and write rights). Note that the webserver itself will still have to be given at least read access to the files in order to serve them to the local user.
You can put your source code somewhere hidden on the host and change the server root path from "C:\xampp\htdocs" to the new location.
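For example, with Apache the document root can be pointed at a less obvious location by editing httpd.conf (the paths below are illustrative, assuming a default XAMPP layout):

```apacheconf
# httpd.conf -- serve from a hidden location instead of C:\xampp\htdocs
# (paths are illustrative; adjust to your install)
DocumentRoot "C:/hidden/app/public"
<Directory "C:/hidden/app/public">
    Require all granted
</Directory>
```

Remember to restart Apache after changing the document root, and note this only hides the files from casual browsing; a local admin can still find them.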
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 9 years ago.
I recently started working in a company and one of the first tasks that I was given was to redo their website.
Being a newbie, and after some trial and error on sites like wix.com and weebly.com, I made a WordPress website with a responsive theme using a WAMP server.
The site seems fine, but when the time came to finally put it on the server, I learned that the hosting company only supports HTML-based websites, whereas the website I built is a PHP one. The following are my questions:
How do I go about publishing my WordPress site on a server that supports only HTML? Is there a way to convert the website, or any other method? If yes, please explain in detail, as I am a newbie.
I was somehow able to export my database from the localhost MySQL server to the server where I want my site to be, but does the web hosting company supporting only HTML-based websites affect the MySQL database? If yes, what should be done and how? Kindly explain in detail.
The cPanel of the webhost is pretty basic, and on calling the company I was told that all I had to do was drag and drop my files there for my site to go online, but that didn't work.
I tried changing the extensions from PHP to HTML and uploading the files, but all I get is a blank screen when going to www.mycompanydomain.ae.
I even tried adding a line to the .htaccess file to serve the pages as HTML without actually changing all the PHP files, but to no avail.
Kindly help as I have spent a lot of time and energy on this but now I am at a roadblock.
You could browse through each page that makes up your website, use your browser to "save" a local copy of each page, and upload those. An easier way to accomplish this foolish task would be to use a web crawling tool like WebReaper on your local website and upload the results to the HTML-only host.
The caveats to doing this are:
Your site is no longer interactive, everything is static.
Nice folder structure goes out the window and everything is a mess
It's obviously a bad idea
Don't do this, it's a bad idea
No, seriously.
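If you go the static-snapshot route despite these caveats, wget's mirror mode is a common alternative to WebReaper; a sketch, assuming the WordPress site is running locally (the URL is illustrative):

```shell
# Crawl the locally hosted WordPress site and rewrite links so the
# saved pages work as plain HTML (URL is illustrative)
wget --mirror --convert-links --adjust-extension \
     --page-requisites --no-parent http://localhost/mysite/
```

The resulting directory can be uploaded as-is to the HTML-only host; anything dynamic (forms, comments, search) will of course stop working.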
The correct solution, if you need anything server-side or database-driven, is to switch to a host that has PHP enabled. You would also want to use a tool like phpMyAdmin to export your local database and import it on the new host.
You can't do it in HTML, but you can use an iframe with wordpress.com, or host it elsewhere...
The plugin Really Static claims to generate HTML files each time you update your WordPress blog: "saving static files via local, FTP, SFTP" and "if you don't have PHP/MySQL support on your server you can host your WordPress installation locally and use a normal HTML webspace for publishing".
I hope this helps!
No, there's no practical way to transform your entire site into HTML.
Switch to a web host that supports PHP.
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 9 years ago.
I just discovered that every website in my hosting account (which is a shared hosting) is infected with malicious code.
The malicious code is a <script> tag appended after the </html> tag. It redirects to a Russian website.
The problem is this: my PHP files are not compromised. I downloaded them via FTP, and they are fine. The "last modified" dates are fine too (some files are from 2012). Even if I upload a brand new PHP file, when I access it through the web it's infected; but if I download it again via FTP, it's fine.
It's like some .htaccess rule is appending the malicious code to all my PHP pages AFTER they pass through the PHP engine, or something like that (but my .htaccess files are fine too).
What could be the problem? Is the hosting provider compromised, or is it my account? What can I do to solve this problem? Google is already sending me malware notifications, and the support guys are slow as hell.
Thanks for your time, and please forgive my poor English.
Edit: Adding <?php exit(); ?> to the end of any PHP file stops the infection, so this seems to be a PHP problem.
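The exit() observation is consistent with PHP's auto_append_file (or auto_prepend_file) directive being hijacked, since exit() stops anything appended after the script from running. A quick way to hunt for such an override, assuming shell access (the paths are illustrative):

```shell
# Look for auto_append_file / auto_prepend_file overrides in php.ini
# and in any .htaccess or .user.ini under the web root
# (paths are illustrative; adjust to your host's layout)
grep -RniE 'auto_(pre|ap)pend_file' /etc/php* /var/www 2>/dev/null
```

On shared hosting without shell access, the same check can be done by downloading the .htaccess files and searching them locally.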
Yeah, it wouldn't hurt to change your FTP password; also look for files that aren't yours or part of the installation. I've had issues like that before, and there were scripts in the images directory that I didn't put there. I removed them. Change any files so that they aren't world-writeable, e.g. from 666 to 644. Do the same to directories, 777 to 755. If the files and directories are owned by the FTP user, the lesser permissions should be fine.
Then maybe try this to clean up or get your host to do it if you don't have access.
http://cachecrew.com/fixing-an-infected-php-web-server/
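The permission changes described above can be applied in bulk; a sketch, assuming shell access to the web root (the path is illustrative):

```shell
# Tighten permissions under the web root (path is illustrative):
# directories to 755 (rwxr-xr-x), files to 644 (rw-r--r--)
find /var/www/mysite -type d -exec chmod 755 {} +
find /var/www/mysite -type f -exec chmod 644 {} +
```

If the webserver runs as a different user than the file owner, verify afterwards that pages still load, since some upload directories may genuinely need write access.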
The first thing I would do is change the FTP password and run an anti-virus on the computer you are using to access the FTP.
Something like this has happened to me before, and the point of intrusion on the web server was the theft of the FTP username and password by a trojan.
About the PHP files: isn't there any strange tag at the start of the files?
In my case, the files were modified at the start of the file, and it wasn't only PHP files; HTML files were also affected.
The script tag that appears after the </html> tag could be inserted there through JavaScript. And don't rely on the modified date (check this answer).
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 9 years ago.
Ok, so my server got hacked last week. The hacker attacked an out-of-date 3rd-party editor (in PHP), implanted a backdoor script in PHP, and did serious damage to my sites. I spent a whole weekend cleaning the backdoor scripts and any malicious code he left on my server. In order to avoid being hacked again, I did the following to my server:
Turn off file_uploads in PHP. Since the hacker uploaded the backdoor through PHP, I disabled the directive in php.ini. So now I can only upload through FTP.
Disable create_function in PHP. None of my software uses this function. The hacker used it the same way as eval(), to execute commands passed in as strings.
Disable popen, exec, system, passthru, proc_open, shell_exec, show_source and phpinfo in php.ini. These functions are primarily used by the backdoor script to modify my files and directories.
Install Suhosin. Find the legitimate functions that are called within eval() and put them in suhosin.executor.eval.whitelist. The hacker put malicious code in my program, obfuscated it with base64_encode(), and then executed it within eval(). So I only allow a couple of legitimate functions to be called within eval().
Turn on suhosin.executor.disable_emodifier. The hacker put another piece of obfuscated code in my program and used the preg_replace() e modifier to execute whatever PHP commands he put in his browser. So he could upload or modify any files on the server through his browser. (Since I turned off file uploads he could not upload any more, but he could still modify and delete files as he wanted.)
By disabling create_function and the preg_replace() e modifier and limiting eval(), even if there is malicious code left uncleaned on my server, the hacker cannot do anything. These are the 3 most dangerous functions in PHP.
Add a .htaccess to every folder but the root directory, forbidding PHP from being executed directly from the browser:

<Files "*.php*">
Order Deny,Allow
Deny from all
</Files>

I put another * after php because I found a backdoor file named missing.php.DISABLED, which could still be executed if I did not put the * after php.
Set the root directory (the only place where .php is allowed to execute) as read-only, and set all files in that folder read-only. So the hacker could not upload any new backdoor script to the only directory where PHP can be executed, nor could he modify the files in that directory.
For the WordPress login, I added the following to the .htaccess in the root directory, where xxx.xxx.xxx.xxx is my IP:

<Files wp-login.php>
Order Deny,Allow
Deny from all
Allow from xxx.xxx.xxx.xxx
</Files>
Set all .htaccess read only.
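The php.ini side of the steps above can be sketched as a fragment (note the actual directive name is file_uploads; the function list mirrors the steps described):

```apacheconf
; php.ini -- hardening fragment mirroring the steps above
file_uploads = Off
disable_functions = create_function,popen,exec,system,passthru,proc_open,shell_exec,show_source,phpinfo
```

Restart the webserver (or PHP-FPM) for the changes to take effect.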
Well, this is what I could do to strengthen the security of my server. Did I miss anything?
Thank you for your advice.
Unless you reimaged the machine from known-clean install media, you can't know there isn't a lingering rootkit.
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 9 years ago.
My websites, hosted on different servers, are being hacked again and again with the same base64 malware code. When I decoded the base64 code I got a link to mbrowserstats.com/statH/stat.php.
Please note: both my core-PHP websites and my WordPress ones are being hacked. They are placing base64 malware code in the following files: index.php, main.php, footer.php, WordPress template files (index.php, main.php, footer.php), the index.php files in the wp-admin, plugins and themes folders, etc.
I have already tried the things below, but all the websites keep being hacked again and again.
Changed all ftp passwords
Changed FTP client from FileZilla to WinSCP
Removed all malware code and re-uploaded all files to the server
Uploaded old backup files without the malware code
Disabled magic_quotes_gpc, register_globals, also exec & shell_exec functions
Used index files to prevent direct folder access
Used the mysql_real_escape_string function to sanitize data for insert queries in the PHP websites
Updated WordPress and also all Plugins to latest version
Installed Malwarebytes Anti-Malware and scanned my computer for malware (full scan)
Confirmed that my websites are not using timthumb.php file
Changed file permissions (755 for folders & 644 for files). Now only image upload folders have 777 permission.
When I checked the websites' visitor details I found some IPs like 150.70.172.111 / 150.70.172.202, hostname 150-70-172-111.trendmicro.com, country Japan. They accessed the websites at times close to when the files were modified (the malware-injected files).
Additional information: I have been using Trend Micro antivirus for the last year. I wonder whether the IPs with hostname 'trendmicro.com' have any relation to the hacking or to the stealing of my FTP passwords.
I suspect that they are using FTP access to insert the malware code. Also, the time between file modifications is very low; they updated all the files within seconds, so I think they are using a program for it. They could not edit all the files within seconds manually, as I have so many files in different folders of the same website.
Please help me to resolve this issue. I have tried many things but it happens again. Thanks
It's tricky to handle this. One of the common ways this happens is that on a shared server a malicious user can use another account and insert a file in your upload directory (which is often world-writeable on shared servers) by traversing down and back up the filesystem. It's not really an issue of passwords being cracked. Things you can do:
Use a private/virtual server, not the standard shared type with more than one user in the same filesystem
Keep WordPress updated
Check all your themes and plugins for online notices of vulnerabilities. A big one is that many themes use timthumb.php for image resizing, which had a big security hole last year. You can continue using it, but make sure to replace it with the current version.
For hosting I highly recommend using something such as http://WPEngine.com, as you will not only get a private experience, but they will also be more on top of security scans than standard hosting companies.
Also, if your site has been hacked you must be very, very careful to remove all backdoors. I recommend doing a clean install, which is obviously tough since you have to put your theme back, and that can contain backdoors as well. Malicious users will create multiple backdoors in case one gets taken down. There are a few scripts online that will scan for these, but none is perfect. Making a clean install and then backing it up offline in case of a hack is a good option.
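When cleaning up, a quick (and admittedly imperfect) way to locate the kind of injected base64 payloads described above, assuming shell access (the path is illustrative):

```shell
# List PHP files containing patterns typical of base64 injections
# (expect false positives; review each hit by hand; path is illustrative)
grep -RlE 'eval\(base64_decode|gzinflate\(base64_decode' \
     --include='*.php' /var/www
```

This only catches the most common obfuscation patterns; it is a triage aid, not a substitute for the clean install recommended above.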
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 8 years ago.
I've always debated this in my head and would now like some input from you guys at Stack. So what is faster?
I can see that grabbing images from files is probably the fastest, since it's local, but the process of finding the files through folders and picking the right one would probably use up the most processing power.
Grabbing an image from a URL could simply be sending a request to the URL and downloading that image. While the image is downloading, other parts of your website are loading.
When loading a page, how does the server run one (or a few) processing threads to build the page? Does a page get built in a procedural fashion (building one thing at a time, as opposed to running everything at the same time)? Could this be the difference between procedural PHP (WordPress) and object-oriented PHP (CodeIgniter)?
When you get a file via URL you need to connect to a server. Now you have two cases:
Server is local
Server is external
If the server is local then you may use the local IP, which avoids a DNS lookup and is pretty fast, but a server is still involved.
If the server is external then you need to use either the domain or the IP, if you know it. You need to factor in the speed of the connection and the speed of the server, and in my personal opinion this is not a good solution.
About using files: you wrote that you have a URL which exactly defines where the file is. You can do the same with files and give the path, so there's no need to search for the file just to read it. I'm certain it's the faster solution.
About WordPress and CodeIgniter: it's still PHP, so it depends on how the code is written. Obviously you can write a careless function that searches the entire server for a file, or you can specify where it should be, or you can give a path to it, which is faster. There are also nice facilities in PHP to search for files and handle them, for example iterators or the simple glob() function.
To conclude, my opinion is that using files instead of URLs is the better solution.
The way it works is,
a) The HTML document (static, or the one emitted by PHP) gets downloaded from the server to the browser.
b) The browser starts parsing it.
c) It parses each and every tag and renders / runs controls (i.e., JavaScript) accordingly.
If there are any resources that need to be loaded, the browser makes an additional request to download each resource.
Any request sent over the network incurs a delay.
There are ways you can optimize this. A few such tips, including reducing DNS lookups, are given here:
http://developer.yahoo.com/performance/rules.html
It is always better to use CSS image sprites and HTML5 local storage if the files are not changed very often.