I want to run a php file with Heroku Scheduler. What are some methods to ensure that not just anyone can come along and execute the file? Is there a way to put stuff above the web root ('www' with a php app)?
The easiest way to accomplish this is to deny web access to these files; Scheduler will still be able to execute them, since it runs them through the PHP CLI rather than over HTTP. Note that <Directory> sections are not allowed inside .htaccess files, so rather than wrapping the rules in <Directory /app/www/DIRECTORY_NAME>, place a .htaccess file inside the directory that holds the scripts, containing just:
Order Deny,Allow
Deny from All
With DIRECTORY_NAME being the name of the directory you've put these PHP files in. Apache will then answer 403 for any web request to that directory, while command-line execution is unaffected.
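As an extra safeguard, the script can also refuse to run unless it was started from the command line. A minimal sketch, assuming nothing about your setup beyond PHP's standard php_sapi_name() function (the filename is hypothetical):
<?php
// cron_task.php - a hypothetical scheduled job.
// Bail out if this script is somehow reached over HTTP instead of
// the command line that Heroku Scheduler uses.
if (php_sapi_name() !== 'cli') {
    http_response_code(403);
    exit('Forbidden');
}

// ...the actual scheduled work goes here...
echo "Running scheduled task\n";
That way, even a misconfigured .htaccess cannot expose the job.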
I have an internal web server running Ubuntu and Apache.
I have granted access to /opt/data_uploads so that I can use this directory to save images uploaded from PHP and fetch them back with Ajax GET requests.
My Apache config in /etc/apache2/apache2.conf looks like this
Alias /data_uploads "/opt/data_uploads/"
<Directory "/opt/data_uploads/">
Options Indexes FollowSymLinks MultiViews
Require all granted
AllowOverride all
Order allow,deny
Allow from all
</Directory>
But the problem is that when I browse to http://123.45.67.89/data_uploads, the directory is fully accessible to everyone, which is dangerous: anyone can see the images uploaded there.
To avoid this I tried Require all denied; now I get a 403, but all my Ajax GET requests fail as well.
I want my website to be able to access the directory, while anyone who browses to http://123.45.67.89/data_uploads directly gets a 403. How can I solve this?
In your configuration you give everyone access to your upload directory.
You must remove this, or only allow your own IP.
But in your case, what you actually want is to let your users upload and download the files they are allowed to.
That means their HTTP requests should not reach the upload directory directly; instead they should call your PHP application.
It is then your PHP application that writes to this directory (for uploads) and reads from it (for downloads).
For this you have to give the Apache user that runs the PHP process read/write permissions on the directory, using chmod and/or chown.
And finally, you'll have to write a PHP controller that handles the upload and download calls; that PHP code will write to and read from your upload directory.
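To illustrate, here is a minimal sketch of such a download controller; the filename download.php, the session-based check, and the exact paths are assumptions to be replaced with your own logic:
<?php
// download.php - serves a file from the upload directory, but only
// after your own authentication check has passed.
session_start();
if (empty($_SESSION['user_id'])) {
    http_response_code(403);
    exit('Forbidden');
}

// basename() strips any path components, so a request cannot
// escape the upload directory with "../" tricks.
$name = basename($_GET['file'] ?? '');
$path = '/opt/data_uploads/' . $name;
if ($name === '' || !is_file($path)) {
    http_response_code(404);
    exit('Not found');
}

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
readfile($path);
Your Ajax GET requests would then target /download.php?file=... instead of /data_uploads/... directly.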
In a Laravel project I have to include multiple projects, so that they are accessible at /example.
These projects have the structure of
/example
- index.php
- main.css
- app.js
(Usually there are more files than that.)
I have tried using Redirect::to("example/index.php"), but this breaks all the <link> and <src> references (I would need to prepend /example to all of them).
This would theoretically work, however I would rather not store these files in the Laravel project itself, since they are basically self-contained projects.
What is the best way to include such external projects?
This would theoretically work, however I would rather not store these files in the Laravel project itself, since they are basically self-contained projects.
That's an excellent approach: keep Laravel as Laravel, and host these projects just outside of it.
Since you're using Apache, here's how to create a Virtual Host for that external project.
Please note - I'm assuming that your project lives in /var/www.
Decide on a URL for that project - I would use example.mylaravelproject.com. But anything will do.
Confirm the path of your project folder. For this example, we'll assume it's /var/www/example/
Run the following command (assuming you're using Ubuntu) - sudo nano /etc/apache2/sites-available/example.mylaravelproject.com.conf
Ensure the new file has the following contents:
<VirtualHost *:80>
ServerAdmin <your email address here>
ServerName example.mylaravelproject.com
DocumentRoot /var/www/example
ErrorLog ${APACHE_LOG_DIR}/error.log
CustomLog ${APACHE_LOG_DIR}/access.log combined
</VirtualHost>
Save and close the file
Run the following command sudo a2ensite example.mylaravelproject.com.conf
Run sudo nano /etc/apache2/apache2.conf
Make sure that this line appears somewhere in the file (preferably inside a <Directory> block): AddType application/x-httpd-php .php
Then restart Apache by issuing the following command sudo service apache2 restart
Technically now your site has a valid vhost and should be working.
If you're doing this on a local environment and want to access your example project via the browser, you'll need to complete a few more steps:
sudo nano /etc/hosts - again, assuming that you're running Ubuntu
Add this line to the hosts file: 127.0.0.1 example.mylaravelproject.com
Save and close that file. You should now be able to access that url via your browser.
If these steps don't work, it's likely that Apache isn't parsing the PHP files. If that's the case, try these links for some good answers on making Apache parse PHP:
Apache 2 server on ubuntu can't parse php code inside html file
Apache Virtual Host not parsing PHP
I got this task from school: make a PHP web application. But I don't really understand what this requirement might mean:
It should be possible to run this application outside the domain root
e.g. sample URL: http://localhost/task/.
I searched a little on the internet but was not able to find anything that I could understand.
I have WAMP, and the folder where my site lives is wamp/www/task.
When they say "outside of domain root" it means that you should not be forced to go to
http://localhost/yourfile.php
but you could put it in a subdir, like
http://localhost/task/yourfile.php
What they want you to do is harder to guess, but it probably means that the application needs to run in any subdirectory, so take care that your includes can handle that (e.g. don't hardcode the directory you're working in; see the sketch below).
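For example, a one-line sketch of a portable include (lib/helpers.php is a hypothetical path):
<?php
// __DIR__ is the directory of the current file, so this include keeps
// working whether the app lives at /task, /task2, or anywhere else.
require __DIR__ . '/lib/helpers.php';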
The domain root here is http://localhost/, so this just means it should be easy to rename your web application's folder and have everything still work.
# http://localhost/task
$ cd wamp/www/
# http://localhost/task2 - should be accessible without you needing to change anything
$ mv task task2
From a technical point of view, you should use relative paths for all your links and images, as well as for external resources such as JavaScript/CSS files.
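For instance, a small sketch of deriving the base path at runtime rather than hardcoding it (the main.css and app.js filenames are placeholders):
<?php
// Compute the app's base URL ("/task", "/task2", ...) from the
// script's own location, so renaming the folder breaks nothing.
$base = rtrim(dirname($_SERVER['SCRIPT_NAME']), '/');
?>
<link rel="stylesheet" href="<?= $base ?>/main.css">
<script src="<?= $base ?>/app.js"></script>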
You can set up a virtual host for your web server and access your PHP application via a name like oorja.local.
In WAMP, just add the code below at the end of your httpd.conf file; it lets you access your PHP application without going through localhost. DocumentRoot and Directory must contain the physical path of your application directory, and you will also need a hosts-file entry mapping 127.0.0.1 to oorja.local so the name resolves.
<VirtualHost *:80>
ServerName oorja.local
DocumentRoot E:/LAMPSYSTEM/wamp/www/oorja/public
<Directory E:/LAMPSYSTEM/wamp/www/oorja/public>
DirectoryIndex index.php
AllowOverride All
Order allow,deny
Allow from all
</Directory>
</VirtualHost>
I created a cron job through goDaddy control center.
The cron job is in the folder "cron jobs".
I don't want anyone to be able to run it. How should I set the permissions of the folder so that it can't be publicly opened, but the cron job can still use it?
Will unchecking Public > Read be enough to prevent anyone from running it?
Just put the files outside of the webroot/document root folder.
Add this to your Apache configuration (note that <Location> blocks are not allowed in .htaccess; if you only have .htaccess access, put the equivalent Deny rules in a .htaccess file inside the cronjobs folder):
<Location /cronjobs>
order deny,allow
deny from all
allow from 127.0.0.1
</Location>
I included allow from 127.0.0.1 so it can still be run from the server itself, i.e. so the cron job keeps working.
Another possible solution, if the file is meant to be used exclusively via include() and never run standalone by a user who enters its URL:
Place this code at the top of the file you want to block direct calling of.
// If this file was requested directly (rather than include()d),
// basename($_SERVER['PHP_SELF']) will be this file's own name.
if (basename($_SERVER['PHP_SELF']) == 'blockedFile.php')
{
    header('Location: ./index.php');
    exit();
}
PHP checks whether the file's name is the one being run directly. If blockedFile.php were included in index.php with include(), then basename($_SERVER['PHP_SELF']) would equal index.php. If it were standalone, it would equal blockedFile.php and send the user back to the index page.
Put it in a directory, and in that directory create a file called .htaccess with this inside:
<FilesMatch "\.php$">
order deny,allow
deny from all
</FilesMatch>
Now only the server itself can access PHP files inside that directory, for example via include or require.
This is useful for keeping your MySQL password safe: you can put the connection function inside a PHP file in this "protected" directory and include it from your scripts, as in the sketch below.
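A minimal sketch of that pattern; every name in it (protected/db.php, connectDb(), the credentials) is hypothetical:
<?php
// protected/db.php - lives in the .htaccess-protected directory.
function connectDb()
{
    // Placeholder credentials; replace with your own.
    return new PDO('mysql:host=localhost;dbname=mydb', 'dbuser', 'secret');
}
A public-facing script then pulls it in with:
<?php
require __DIR__ . '/protected/db.php';
$pdo = connectDb();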
One option that you have is to use the $_SERVER values to see if it is a web request or a cli request.
See http://php.net/manual/en/reserved.variables.server.php
I would look at checking to see if the $_SERVER['argv'] value is set at the start of your script(s). If it's not set then exit the script.
Alternatively you can check to see if $_SERVER['SERVER_ADDR'] is set, which would mean it's being executed by the webserver.
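A sketch combining both checks (hedged, since behaviour can vary between hosts; $_SERVER['argv'], for instance, can also be populated for web requests when register_argc_argv is enabled):
<?php
// Exit unless this script was started from the command line:
// argv is populated for CLI runs, while SERVER_ADDR is only set
// when a web server is handling the request.
if (!isset($_SERVER['argv']) || isset($_SERVER['SERVER_ADDR'])) {
    exit('This script can only be run from the command line.');
}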
Note that I don't have a godaddy account handy to test this, so ensure you verify before going live.
I have a processing file for my website's payments. It works just fine, but what I would like to do is log all the requests to this page so that if anything throws an error, the raw data is saved and I can process the transaction manually. The processing file uses fopen to write to the log file in another directory.
What I have right now is a separate folder in my root directory with permissions 755, a log file inside it with permissions 777, and the processing file that writes to the log (a PHP script, if that matters) set to 777.
This works right now, but the log file is publicly available. I know I can be doing this better and that the permissions aren't correct. How can I do this better?
Put the log file outside the document root. The PHP script that writes to it will still be able to get to it (via the full path) but Apache won't be able to serve it.
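A sketch of what that looks like in the processing script; the /home/youruser/logs path is a placeholder for any directory outside your document root:
<?php
// Append the raw request body to a log Apache cannot serve,
// because it lives outside the document root.
$fh = fopen('/home/youruser/logs/payments.log', 'a');
if ($fh !== false) {
    fwrite($fh, date('c') . ' ' . file_get_contents('php://input') . "\n");
    fclose($fh);
}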
I came across this while searching for the answer myself. I don't believe there is a simple "permissions fix" for what you want, and the safest way is to put the log files outside the public_html directory.
However, this can sometimes be a nuisance - especially if you want to, say, catch PayPal IPN dump text in a log file without making it publicly accessible.
In such cases, you can use .htaccess directives to let the script write the file while denying public read access.
For example, this works for me (Apache .htaccess in the root public_html folder):
<FilesMatch "mycustom\.log">
Order allow,deny
Deny from all
</FilesMatch>
And if you have multiple logs you want to protect, use pipe-separated patterns like this:
<FilesMatch "mycustom\.log|ipn_errors\.log">
Order allow,deny
Deny from all
</FilesMatch>
It is worth noting that the Order/Deny directives above are deprecated as of Apache 2.4, and you may wish to use the current equivalent, Require all denied, inside the same FilesMatch block instead: https://httpd.apache.org/docs/2.4/howto/access.html
Hope that helps you!