How can I securely allow web users to create files? - php

I'm building a website which allows certain users to write reviews, and I want a small php file to be automatically generated when they do. What's the most secure way to set up accounts/groups/file permissions to allow this? Ideally, I'd like the review writers to be able to change the title in case they make a mistake, which would require php to be able to not only create files and folders, but move and/or remove them, as well. However, that's not an absolute necessity. My test server is running Linux/Apache, the newest versions of everything, and for testing purposes I've temporarily set the owner of the main reviews folder as the server. I'm also open to other suggestions on how to make this happen. I'm not really an IT guy, but I can write shell scripts just fine.
Edit:
Thanks to the selected answer, I was able to come up with a solution. I used this guide (http://www.seomoz.org/ugc/using-mod-rewrite-to-convert-dynamic-urls-to-seo-friendly-urls) and modified it to just load the desired PHP script with no variables; the script retrieves the information directly from the original URL using $_SERVER['REQUEST_URI']. Here's what my .htaccess file looks like; it sends www.domain.com/reviews/the-review-filepath.php to www.domain.com/reviews/review.php.
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule !^review\.php$ review.php
It was much easier for me to do it this way because I know a lot more about PHP than about regular expressions.
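As a minimal sketch of what review.php can do with the original URL (the slug handling and the database lookup are illustrative, not the original code):

```php
<?php
// review.php receives every /reviews/* request via the rewrite rule above
// and recovers the requested slug from the original URL itself.
function slugFromUri(string $uri): string
{
    $path = parse_url($uri, PHP_URL_PATH);           // drop any query string
    $slug = basename($path, '.php');                 // "the-review-filepath"
    return preg_replace('/[^a-z0-9-]/i', '', $slug); // sanitize the slug
}

$slug = slugFromUri($_SERVER['REQUEST_URI'] ?? '/');
// ...look $slug up in the database and render the matching review...
```

With this approach the review "files" never need to exist on disk at all, which sidesteps the file-permission problem entirely.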
Thanks to everyone who answered and/or commented. This is much better than the way I was trying to do it before.

Extending from comment:
If you want a "clean" URL (e.g. /post/123/comment/456) instead of a "parameterized" URL (e.g. /?post=123&comment=456), you can still use a database and take advantage of mod_rewrite (since you tagged apache).
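For example, an .htaccess sketch (the script name and parameter names are placeholders) that maps the clean form onto the parameterized one internally:

```apache
RewriteEngine On
# /post/123/comment/456  ->  index.php?post=123&comment=456 (internal rewrite)
RewriteRule ^post/([0-9]+)/comment/([0-9]+)$ index.php?post=$1&comment=$2 [L,QSA]
```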

htaccess redirection of multiple URLs

I usually use this site to solve my problems, but for the first time I couldn't find a proper question, so forgive me if it actually exists!
Currently, I have valid URLs in such format:
http://www.example.com/index/modules/news/article.php?storyid=15807
The number at the end is generated dynamically by the CMS and therefore changes for each new piece of content published.
In order to use shorter URLs, I want these pages to be accessible by subdomain in such format:
http://news.example.com/n15807
Please help me and let me know if there is a better option rather than using htaccess.
Sure! You can use mod_rewrite in apache's virtual host config file.
Sarcasm aside, the way to reroute a url is to use the tool that accepts the url and does something with it, which in the case of the typical php installation is apache. If PHP were able to accept connections without the use of an http server, then PHP could do it. I suppose one could build a server out of php and run it as a singleton, but I've never heard of such a thing.
So here's the thing, using .htaccess and mod_rewrite isn't so hard, if you understand what you're trying to do. In your case, you want to be able to translate news.example.com/n15807 into that really long uri. That really long uri is what the server will actually load.
mod_rewrite in effect matches the URL against a regular expression and replaces it with another URI. So you would attempt to match something like ^([a-z]+)\.example\.com/([a-z])([0-9]+)$ and use the captures in /index/modules/$1/article.php?storyid=$3 (I have no idea what the n in front of the number is supposed to mean.)
The regex I gave is intended only to point you in the right direction; I haven't tested it. [EDIT - It definitely won't work as written; it's been a while since I worked on mod_rewrite. But the idea is still valid. FWIW, I wouldn't try to use subdomains, though]
Summary: Use the tool that works with the urls. The most accessible way is with .htaccess. Once you figure it out, you're done. You can reuse it over and over in future websites.
After hours of searching I finally got it!
RewriteEngine on
RewriteCond %{HTTP_HOST} ^news\.example\.com$
RewriteRule ^n([0-9]+)$ http://www.example.com/index/modules/news/article.php?storyid=$1\#siteBody [R=301,L,NE]
Thought it might help someone someday not to spend as much time as I did!

How to protect my PHP website from the .htaccess URL redirect malware script attack?

How was this .htaccess file with malware code injected into the website?
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteOptions inherit
RewriteCond %{HTTP_REFERER} .*(msn|search|live|altavista|excite|ask|aol|google|mail|bing|yahoo).*$ [NC]
</IfModule>
How can I protect my website from the same attack?
This link was useful to some extent:
https://security.stackexchange.com/questions/16361/how-to-prevent-my-website-from-getting-malware-injection-attacks
but my team expects me to protect the website through code. Is this possible?
I also found that a few other websites suffered a similar attack, but they use a specific kind of code to protect themselves. I cannot use those scripts because they aren't exactly suitable for my case.
The website is a plain (core) PHP website. If moderators find this question to be not a real question or an exact duplicate, then before closing it or voting it down, please provide a helpful link. I trust this website.
Are you using timthumb.php or a similar upload / linking script? Older versions are fraught with security vulnerabilities. It's a very common vulnerability on Wordpress installations, especially those which use themes that come bundled with their own timthumb.php / thumb.php.
If that's the issue, lock that script down! If it's a custom script, take a look at the latest timthumb.php source code & try to use some of their techniques.
Also, make sure your file permissions are locked down for the Apache / web users and groups. E.g., do NOT allow .htaccess to be writable by the apache user/group!
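A quick sketch of that idea; it uses a temp file as a stand-in for the real .htaccess, since the actual path and the account Apache runs as depend on your setup:

```shell
# Stand-in for /var/www/site/.htaccess; on a real host, target the actual
# file and make sure its owner is a deploy account, not the Apache account.
f=$(mktemp)
chmod 644 "$f"        # owner read/write; web server (group/other) read-only
stat -c '%a' "$f"     # confirm the mode took effect
rm -f "$f"
```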
How was this .htaccess code injected into the .htaccess file?
If it has been injected then either someone has got root on your machine (not going to conjecture why/how) or your permissions model is wrong.
If someone has root, then you are totally PWNed - brush up your CV and go looking for another job - you don't need to bother reading the rest of this post.
But it's much more likely that the permissions are wrong (though even this, on its own, is not sufficient for the files to be modified - you have a vulnerability elsewhere).
You should be able to identify the primary vulnerability. If you don't know how to do this then get some help. In addition to fixing this, you need to fix the permissions on your site. Only specifically designated locations should be writeable by the webserver uid - and if these are inside the document root then you should take appropriate measures to protect your system from code injection (disable PHP access, preferably all webserver access, although this is still a lot less secure than keeping it outside the document root altogether).
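One way to audit for wrongly writable files (a sketch; a temp directory stands in for your document root, and on a real host you would look for files writable by the account Apache runs as):

```shell
root=$(mktemp -d)            # stand-in for your document root
touch "$root/index.php" "$root/hacked.php"
chmod 444 "$root/index.php"  # read-only: what most files should look like
chmod 666 "$root/hacked.php" # world-writable: what you are hunting for
find "$root" -type f -perm -o=w   # lists only the world-writable file
rm -rf "$root"
```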

Site Converter - Website Copier

Does anybody know of a software program that will convert a website built with PHP, JSON and jQuery into a mainly HTML format? We need to do a conversion for SEO purposes and don't want to have to rewrite the whole site.
HTML is a markup language; PHP is a programming language. You cannot convert one to the other, I'm sorry.
If you're trying to make sure that you have nothing but .HTML extensions on your public URLs for SEO purposes:
Someone's selling you a line of BS.
You need access to your server configuration.
You don't have to convert anything but your links.
The .PHP extension is simply the default extension that Apache is configured to hand to the PHP engine for parsing. You can change which file extensions get parsed in your server configuration.
http://encodable.com/parse_html_files_as_php/
This will allow you to keep .HTM files static and have .HTML files parsed as if they were .PHP files.
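For instance, with Apache and mod_php, a handler directive along these lines does the trick (a sketch; the exact handler name depends on how PHP is installed on your server):

```apache
# Parse .html files through the PHP engine; .htm files stay static.
<FilesMatch "\.html$">
    SetHandler application/x-httpd-php
</FilesMatch>
```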
Try this: http://www.httrack.com/
It will only return a static HTML site. But it might be a good base for you.
Since the only thing which really knows what type of file you're using is the server itself, it does not really matter what you're using on the back end. Most search engines are smart enough to know that so they don't really care so much. Now, people might care. People might say, "Hm, well, this is .html, that means that this person must have a flat file which is constantly being updated," but I doubt it.
If you're really concerned about having a .html extension, then you can fake it by using htaccess:
RewriteRule ^(.*)\.html$ $1.php [L]
If that is placed in a .htaccess file at the root of your site, it will internally rewrite any request ending in .html to the corresponding .php page. It does this transparently, both for users and for crawlers.
Of course, every link on your site will need to be changed from .php to .html, but that replaces the impossible task of using only .html files with the merely annoying task of updating all of your .php links.
As to removing JavaScript, well, you could do that, or you could design your site in such a way that it still uses AJAX but it works with the search engines instead of against them. The biggest trick is to make sure that your site can work with as little AJAX as possible and then use AJAX to supplement. We've come a long way from requiring that all websites work in lynx, but it is still good practice to make sure that they are still sane without the benefit of JS/CSS.
Besides, search engines are getting smarter. Google has been working to read AJAX intelligently since 2009. But even if they weren't, there are plenty of articles out there on using AJAX without hurting SEO.
There is no need to nerf your site because of SEO -- You can have your AJAX and SEO too.
This is hard to accomplish if there is a lot of dynamic data. For a simple website you can just cache every page and make that your new website, though I am not sure how useful that would be. For example, if you have forms or other user-input fields, things will simply not work. In any case, this is how you do it using wget:
$ wget -m http://www.example.com/
More reading here.

How to hide the url in php

Is it possible to hide the url in the address bar of the web browser so that it won't necessarily match the location of the files?
For example, this url:
http://localhost/exp/regstuds.php
You can always tell, just by looking at it, where to find the files on the computer.
Is it possible to distort or disarrange or hide the url in such a way that the location of the files will not be revealed
Yes, if you're using Apache look into using mod_rewrite. There are similar rewrite modules for pretty much all other web servers too.
I hope your sole motivation for doing this is not "security through obscurity". Because if it is, you should probably stop and spend more time on something more effective.
If you are hosting your php on an Apache server, you probably have the ability to use the mod_rewrite utility. You can do this be adding rules to your .htaccess file...
RewriteEngine on
RewriteRule ^RegStuds/ regstuds.php
This would cause http://localhost/RegStuds/ to actually render regstuds.php, but without ever displaying it in the address bar.
If you are on IIS, you can perform the same function using an ISAPI Rewrite Filter.
If you don't have mod_rewrite or an ISAPI Rewrite Filter, you can get a similar result using a folder structure, so you would have a physical path of RegStuds/index.php - and you would never need to link to "index.php" as it is the default file. This is the least recommended way of doing it.
No, it's not.
Each bit of functionality must have a unique identifier (URI) so that the request is routed to the right bit of code. The mapping can be non-linear using all sorts of tricks - mod_rewrite, front controller, content negotiation...but this is just obscuring what's really going on.
You can fudge what appears in the address bar on the browser by using a front-controller architecture and using forms / POSTs for every request but this is going to get very messy, very quickly.
Perhaps if you were to explain why you wanted to do this we might be able to come up with a better solution.
C.

One code, many websites

I need to develop a project that would allow me to instance many copies of a website, but each copy needs to be a separate website. I could upload the same code to many different accounts, but I would prefer to have only one copy of the code. Each website would be an "instance", so to speak. This way I could upload the code once and update all the websites at the same time.
For technical reasons I need to use PHP (but I'm interested in the other options too, for my own knowledge), and I thought Jelix could be a good choice of framework. Are there better options out there?
You can have all code in one directory, and then create virtual subdirectories in all your web sites, which all point to this directory. This is how Microsoft solves the problem in SharePoint.
The easiest bet is to have all the websites link to one server (perhaps distributed).
Pass the calling URL through your webserver to generate configuration information. Use those passed URLs to define the differences between each site.
Beyond that, the framework is almost immaterial to the question, so I'll leave it to someone else to answer.
Just remember, if you make 20 copies of the same code, that's 20x the time it'll take to fix bugs.
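A minimal PHP sketch of driving per-site configuration from the calling URL (the host names, config keys, and fallback are all made up for illustration):

```php
<?php
// Pick per-site settings from the Host header the web server passed in.
function siteConfig(string $host): array
{
    $sites = [
        'alpha.example.com' => ['db' => 'alpha_db', 'theme' => 'blue'],
        'beta.example.com'  => ['db' => 'beta_db',  'theme' => 'green'],
    ];
    // Unknown hosts fall back to a default site.
    return $sites[$host] ?? $sites['alpha.example.com'];
}

$config = siteConfig($_SERVER['HTTP_HOST'] ?? 'alpha.example.com');
```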
If you're using UNIX or Linux for a web server, you could create one master copy of the PHP code, and then use symbolic links to the actual files that are in separate directories with virtual websites set up in Apache. You could also put site-specific config files under those directories, but the bulk of the PHP code would be resolved as symbolic links to the "master" code.
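A sketch of the symlink layout (a temp directory stands in for /var/www; on a real host the per-site directories would be the docroots of your Apache virtual hosts):

```shell
# One "master" code tree shared by two vhost docroots via symlinks;
# per-site config files stay as real, unlinked files.
root=$(mktemp -d)
mkdir -p "$root/master" "$root/site-a" "$root/site-b"
echo '<?php // shared code' > "$root/master/index.php"
ln -s "$root/master/index.php" "$root/site-a/index.php"
ln -s "$root/master/index.php" "$root/site-b/index.php"
echo 'site-a settings' > "$root/site-a/config.ini"   # per-site, not linked
readlink "$root/site-a/index.php"   # resolves to the master copy
rm -rf "$root"
```

Updating the file under master/ then updates every site at once.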
I'm not sure what kind of websites you're talking about, but why not use an already developed application like Wordpress or any other cms? The code is identical on every website, and you can easily update it. The website-specific data is only present in the single configuration file, and the MySQL database.
