When making a PHP website in Dreamweaver, does the site have to have an htdocs folder?
The problem I have is that I have a domain, www.whatever.com.
Once I created index.php in Dreamweaver, I hit the Put button and it uploads just fine.
So the connection from Dreamweaver to my website works.
But when I go to www.whatever.com, it shows an Apache test page where I want index.php to show.
The answer to your first question is "no". Every website is set up differently, and having an htdocs folder is not a requirement by any means.
I suggest trying to upload index.php to the root directory (folder). The "root" is basically the top-level folder that you have access to on your hosting account. Then, if that doesn't work, keep trying folders until it does. As other people suggested, the correct folder may be called "public", "public_html", "www", or something else.
Just be sure you remember where it is located for future reference, and don't leave a bunch of index.php files scattered in various locations on your server, or it could create problems on your website in the future.
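To find the correct folder quickly, a minimal probe script helps (a sketch; upload it as index.php into one candidate folder at a time and load your domain):

<?php
// Probe page: upload as index.php into a candidate folder
// (htdocs, public_html, www, ...) and open www.whatever.com.
// Seeing this output instead of the Apache test page means
// this folder is the document root.
echo 'Document root found: ' . __DIR__;

Once its output shows up, that folder is where the real index.php belongs, and the probe can be deleted.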
I am using hosting provided by 000webhost.com.
The root directory is public_html.
I set its file attributes to 700 using FileZilla FTP.
I also set the individual file permissions to 600.
The public_html directory has only one file, which is index.html.
Now, even though I have set the public permission to zero, as is evident from the right-most zero in both 600 and 700, I am still able to view index.html in a web browser. Why is that? I thought the last zero in 600 or 700 meant the public would not be able to view the file, so what is happening?
I think this is because the file is readable by the user Apache runs as, and it is Apache that delivers the file to the browser. What is the ownership of the file/folder? Can you put these secure files outside the web root?
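For reference, here is a short sketch of how the octal digits map to owner/group/other (it assumes an index.html in the script's working directory; on shared hosts the web server typically reads files as, or on behalf of, your own user, so only the owner digit has to allow access):

<?php
// Octal permission digits are owner / group / other:
//   0700 => owner rwx, group ---, other ---
//   0600 => owner rw-, group ---, other ---
// The final "other" digit does not block HTTP access when the
// server reads the file as (or on behalf of) the owner.
chmod('index.html', 0600);                       // owner read/write only
printf("%o\n", fileperms('index.html') & 0777);  // prints 600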
I am trying to create a basic login setup through Laravel. I used the following guide to help me get started:
https://scotch.io/tutorials/simple-and-easy-laravel-login-authentication
After completing the steps, I used this guide to help me upload the files to GoDaddy's web hosting service:
https://medium.com/@kunalnagar/deploying-laravel-5-on-godaddy-shared-hosting-888ec96f64cd
I followed the guide to the letter, but I'm still having some issues. After I moved the contents of the "public" folder to the public_html folder and left the remaining files in the Laravel folder in my home directory, I was supposed to be able to type mydomain.com/awesomeproject and see the Laravel login that was created. Instead, it loads a blank page.
I might be messing up the step where the guide asks me to change the paths to reflect the new directory; I'm not sure exactly how that should be set up. These are the current paths that I put:
require __DIR__.'/../Laravel/bootstrap/autoload.php';
$app = require_once __DIR__.'/../Laravel/bootstrap/start.php';
The autoload.php and start.php files are located in Laravel -> html -> bootstrap. I'm not sure if that has something to do with it. Theoretically, after doing these steps correctly, I should be able to see my Laravel application on my domain, but so far I've only gotten a blank screen. I'm very new to these things, so I don't know how much of this I am getting totally wrong. I essentially just want to get this login working on my site using Laravel. Any help is greatly appreciated. Thanks.
The screenshots show the contents of the Laravel folder and the contents of the html folder within it.
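If the bootstrap files really sit in Laravel/html/bootstrap rather than Laravel/bootstrap, the require paths need that extra html segment. A sketch of public_html/index.php under the layout described in the question (the paths are an assumption based on the described folder structure, not the guide's canonical setup):

<?php
// Assumed layout: ~/Laravel/html/bootstrap/{autoload.php,start.php}
// with this file living at ~/public_html/index.php.
require __DIR__.'/../Laravel/html/bootstrap/autoload.php';
$app = require_once __DIR__.'/../Laravel/html/bootstrap/start.php';

A blank page usually means a fatal PHP error with display_errors turned off; the hosting account's PHP error log will show whether one of these require paths is what is failing.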
I have LAMP installed on my server and I use virtual hosts to map domains to subdirectories. I need to allow my customers to upload files (including PHP) to their sites using FTP.
The problem is that a customer using the domain xxx.com.br uploaded a file test.php and executed it like:
xxx.com.br/test.php
The content of test.php is file_put_contents("../../xxx.txt", "teste"), and it worked! The file xxx.txt was created two levels above his domain folder! How do I prevent this from happening?
Don't give the PHP process access to directories it isn't meant to reach.
That's the whole point of the permission system.
On Linux, PHP will generally run as its own user; just make sure that user doesn't have read or write permission on any files or directories you don't want exposed.
The open_basedir configuration directive exists for exactly this purpose; see the PHP manual for details.
It is also good to use FastCGI, which allows each script to be run under its owner, so one customer's scripts cannot reach another customer's files.
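As a concrete illustration, here is a minimal sketch of what that restriction looks like (the paths and domain are placeholders taken from the question; with mod_php the directive would live in the virtual host, e.g. php_admin_value open_basedir "/var/www/xxx.com.br/:/tmp/"):

<?php
// Run from a script inside /var/www/xxx.com.br/ with the
// open_basedir restriction above in place.
var_dump(ini_get('open_basedir'));                  // the active restriction
$ok = @file_put_contents('../../xxx.txt', 'teste');
var_dump($ok);                                      // bool(false): blocked by open_basedir

With open_basedir set, PHP refuses any filesystem access outside the listed prefixes, so the ../../xxx.txt write from the question fails regardless of filesystem permissions.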
I have some files in a directory and its subdirectories on an open HTTP site.
For Example:
http://example.com/directory/file1
http://example.com/directory/file2
http://example.com/directory/sub-directory/file1
http://example.com/directory/sub-directory/file2
http://example.com/directory/sub-directory2/file1
http://example.com/directory/sub-directory2/file2
I want to copy the full directory to my server.
I don't have SSH or FTP access to http://example.com.
I have tried a transloader script, which grabs only one file at a time.
I need to copy the full directory, exactly as it is on the HTTP server, to my new server.
Thanks
Use wget (or curl) recursively:
wget -r --no-parent http://example.com/directory/
The -r flag follows links recursively, and --no-parent keeps wget from ascending above /directory/, so you mirror just that tree.
You are unable to do this in general. You can grab the content of the visual layer that the site serves to your browser, but you cannot grab any of the "behind the scenes" pages. You won't be able to get any of the server-side code that does the back-end processing to create what you see on the front end.
The only way to do this is if the site exposes directory listings. By this, I mean that when you go to a directory such as example.com/test/, the server returns a list of all files in that directory. Most sites disable such listings, so unless you have direct access this is not doable; leaving listings open is insecure and creates many headaches for development and privacy.
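For the case where directory listings are open (as they appear to be here), a transloader-style script can be extended to recurse instead of grabbing one file at a time. A rough sketch, assuming Apache-style auto-index pages where subdirectory links end in "/" (the function name and the regex are illustrative, and there is no error handling):

<?php
// Recursively mirror an open HTTP directory listing.
// Assumes auto-index HTML where subdirectories link as "name/".
function mirror(string $url, string $dest): void {
    if (!is_dir($dest)) {
        mkdir($dest, 0755, true);
    }
    $html = file_get_contents($url);
    // Skip parent links ("../"), absolute links ("/icons/...")
    // and the auto-index sort links ("?C=N;O=D").
    preg_match_all('/href="(?![.\/?])([^"]+)"/', $html, $m);
    foreach ($m[1] as $link) {
        if (substr($link, -1) === '/') {
            mirror($url . $link, $dest . '/' . rtrim($link, '/'));
        } else {
            file_put_contents($dest . '/' . rawurldecode($link),
                file_get_contents($url . $link));
        }
    }
}

mirror('http://example.com/directory/', __DIR__ . '/directory');

wget -r remains the simpler option when you can run it; a script like this is only for hosts where PHP is all you can execute.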
I'm making a website, which is a forum. I'm doing this for the first time.
I've uploaded some content to 000webhost through FTP to test my website. The problem is that there is no index.html file there, but there is an index.php file.
It works perfectly on localhost. What should I do?
The error is "This webpage is not available".
Here are my config file settings, as I used them on localhost. Is there any problem there?
define('MYSQL_HOSTNAME', '127.0.0.1');
define('MYSQL_USERNAME', '1234');
define('MYSQL_PASSWORD', 'demo');
define('MYSQL_DATABASE', '1234');
It seems the service you are using always looks for an index.html, so I see only three options:
Contact your web host and ask how to change the main page.
Make an index.html page and have it redirect to index.php.
Make an index.html page and put an iframe in it with index.php as its src.
What error is it throwing exactly? A 404 error?
Is your index.php file in the root directory?
Could you provide the web address?
Most of the time your localhost installation will be different from the web hosting you are using, and there are a lot of things that could be the problem.
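One common difference is the database configuration: the localhost constants shown in the question will not match the hosting account's values. A hypothetical sketch of what the hosted config usually looks like (every value below is invented; the real host name and the prefixed user/database names come from the hosting control panel):

<?php
// Hosting DB credentials usually differ from localhost ones.
// All values here are placeholders, not real 000webhost settings.
define('MYSQL_HOSTNAME', 'mysqlXX.000webhost.com'); // not 127.0.0.1
define('MYSQL_USERNAME', 'a1234567_forum');
define('MYSQL_PASSWORD', 'your-hosting-db-password');
define('MYSQL_DATABASE', 'a1234567_forum');

A wrong database host alone would normally produce a connection error rather than "webpage not available", so it is worth checking the document root and file placement first.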