Codeigniter - HTML not getting rendered - php

I have deployed my PHP application built on CodeIgniter. When I call the root URL, the browser is successfully redirected to the login view, but no HTML is rendered.
What can be the issue?

Did you deploy it to a different computer than your development machine? Maybe you are using "nice URLs" and the target server does not allow URL rewriting with .htaccess?

If nothing is being shown in the browser, it is worth checking the following (a quick sanity-check sketch follows this list):
system/cache is writable
system/logs is writable
GZIP compression is disabled in application/config/config.php
telnet to port 80 and see what you get back; any headers?
tail the logs to watch for errors
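A minimal sketch for checking the first items from the server ($base is a placeholder path; the directory layout assumes CI 1.x as listed above, on CI 2+ the directories live under application/ instead):
<?php
$base = '/path/to/your/ci/installation'; // placeholder, adjust to your deployment

foreach (array('system/cache', 'system/logs') as $dir) {
    $path = $base . '/' . $dir;
    echo $dir . ': ' . (is_writable($path) ? 'writable' : 'NOT writable') . PHP_EOL;
}

// For the GZIP item, confirm $config['compress_output'] = FALSE;
// in application/config/config.php.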

It could be related to the way you load views.
Loading a view this way sends it straight to the browser:
$this->load->view('foo_view.php', $data);
But if the optional third parameter is set to TRUE, it returns the data as a string, so you have to echo it yourself:
echo $this->load->view('foo_view.php', $data, TRUE);
More info at the bottom of this CI User Guide page.
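Putting that together, a minimal controller sketch (the Foo controller and foo_view view names are just illustrative):
<?php
class Foo extends CI_Controller {

    public function index()
    {
        $data = array('title' => 'Hello');

        // Third parameter TRUE: the view is returned as a string, NOT sent to
        // the browser -- the page stays blank unless you echo the result.
        $html = $this->load->view('foo_view', $data, TRUE);
        echo $html;
    }
}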

Deployment needs to be done properly, otherwise you are bound to see issues. This may prove helpful to you:
Deploying CodeIgniter (or any PHP projects) to the live site?


Forbidden access but working with a refresh

I'm trying to access a streaming page but I get the error "Forbidden. You do not have permission to access this document."
However I can skip this message with F5/refresh and watch the video.
Is there any way to open this URL and do a refresh automatically? (using PHP)
I've tried something like this, but it does not seem to work:
header("Refresh:0; url=http://www.url.com");
Thank you in advance.
The forbidden access message is most likely coming from your web server configuration (Apache?). The browser will stop there, and no document will be loaded from your server.
Since PHP would only be interpreted after that point, it will actually not get interpreted at all. You have no way to override this behavior in PHP alone; you need to fix the configuration on the server.
If you see a 403 once every two loads, chances are you either have a load-balancer type of setup (hitting one server or the other), or something that alternates between two configurations (for example an issue in your domain name setup, like the ServerName directive).
In that case, adding header("Refresh:0; url=http://www.url.com"); to your page will only make things worse, since the successful load will refresh and land back on the forbidden message (403).
Check your web server config and logs to find the issue.
It's more likely the page is enforcing an HTTP referer restriction, requiring requests to come from its own domain.
If you are accessing the page from code, just add the referer to the request.
In PHP you can use cURL and add this line:
curl_setopt($ch, CURLOPT_REFERER, 'domain_url');
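For example, a sketch of a full request with the referer set (the URLs here are placeholders; use the domain the site actually expects):
<?php
$ch = curl_init('https://example.com/streaming-page');      // placeholder target URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);              // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);              // follow redirects
curl_setopt($ch, CURLOPT_REFERER, 'https://example.com/');   // the referer the server expects

$body = curl_exec($ch);
if ($body === false) {
    echo 'cURL error: ' . curl_error($ch);
}
curl_close($ch);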

How to make apache serve file:///

I'm kind of new to HTML, PHP and stuff.
I'm trying to test a web site locally before putting it online. I set up my Apache server with PHP and MySQL and made a virtual host point localhost to
"C:/path/to/docroot/"
and everything works great.
Now in my index.html I have a link pointing to the file "mail-form.php" in the same directory (the C:/path/to/docroot/). In this link I only specified
href="mail-form.php"
and no absolute path, because I don't know what the absolute path will be on the production server, and I don't feel like changing them all after testing. (Here I'm open to suggestions, if this is bad.)
Now the thing I don't understand is the following: when I type in the browser's URL
"file:///C:/path/to/docroot/index.html"
and then click on the link, the browser tries to open
"file:///C:/path/to/docroot/mail-form.php"
and this doesn't get interpreted by PHP, but is returned as plain text.
If I instead type in
"localhost/index.html"
and then click on the link, the browser calls
"localhost/mail-form.php"
and it gets interpreted properly showing what it should.
I can obviously live with this, but I'm curious if there is a way to make Apache/PHP serve the "file:///..." thing just as well as the "localhost/..." thing? After all they are both the same file. Or is it a browser problem? Or am I thinking wrong?
You can't make Apache serve file:///. Using that scheme instructs the browser to fetch the file directly from the filesystem. If you want to use Apache then you have to use http:// (or another URL scheme that makes a network request that Apache supports).
No. The file:/// protocol is not HTTP. The browser won't even send the request to the localhost server you're running; it just reads the file directly from disk.

failed to open stream: no suitable wrapper could be found

Hello, I am implementing PHP files from one website into another, and here is the error message I am getting when trying to open the following page with the implemented PHP files:
http://www.holidaysavers.ca/europe-destinations-canada.php
Basically, the PHP files I am importing from one website into another are identical; they work on the original website, but when I implement them on the new website they do not work anymore.
Could you assist me in getting this resolved?
Thank you
You can't include a PHP script that is on an external website/server into your local script, unless you enable allow_url_include in your php.ini (if you have access to it).
Instead, you can let that website/server render the page and fetch the resulting HTML output in your local script.
Replace this line in your script:
include('http://www.holidaysavers.ca/europe-canada.php?detour');
With this:
echo file_get_contents('http://www.holidaysavers.ca/europe-canada.php?detour');
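If allow_url_fopen happens to be disabled as well, the same fetch can be done with cURL (a sketch using the URL from the question):
<?php
$ch = curl_init('http://www.holidaysavers.ca/europe-canada.php?detour');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the HTML instead of printing it
$html = curl_exec($ch);
curl_close($ch);
echo $html;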
Could you post the code from "europe-destinations-canada.php"? It looks like the script is trying to do something that's not enabled in the PHP setup on this new site/server.
I don't really know what kind of host you are using, or whether you are using XAMPP, but I do have an easy fix for XAMPP and possibly other web server software. Go to your php.ini file, which you can search for or find at C:\xampp\php\php.ini; php.ini should be in the php folder inside the server software's folder. Now search for allow_url_include in the php.ini file and change Off to On, if it isn't already on. This is most likely the fix, because it worked for me.
I might be able to help further if I know whether you are using a hosting service or a home server. If you are on a hosting service, please share which provider you are using so I can look into it further.
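If you are not sure which php.ini the server actually loads, or whether the change took effect, a quick check like this (just a sketch) can confirm it:
<?php
echo 'loaded php.ini: ' . php_ini_loaded_file() . PHP_EOL;            // which php.ini is in use
echo 'allow_url_include: ' . ini_get('allow_url_include') . PHP_EOL;  // needed for include() over HTTP
echo 'allow_url_fopen: ' . ini_get('allow_url_fopen') . PHP_EOL;      // needed for file_get_contents() over HTTP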
Using a random remote PHP file as an example, the goal is to use this remote file locally and make sure it hasn't changed or been altered. The remote file will be downloaded one time only.
Hard-coding the SHA-256 signature avoids a network call on startup. This is just a base that can be adapted to many scenarios, like checking for updates, depending on your needs.
<?php
$lib_url = "https://raw.githubusercontent.com/getopt-php/getopt-php/master/src/CommandInterface.php";
$lib_filename = basename($lib_url);
// SHA256 signature
$lib_signature = hash_file("sha256",$lib_url); // "dba0b3fe70b52adbb8376be6a256d2cc371b2fe49ef35f0c6e15cd6d60c319dd"
// Hardcode the signature to avoid a network call on startup:
//$lib_signature = "dba0b3fe70b52adbb8376be6a256d2cc371b2fe49ef35f0c6e15cd6d60c319dd";
if (!is_file($lib_filename) || $lib_signature != hash_file("sha256",$lib_filename)){
// No local copy found, or file signature invalid, get a copy
copy($lib_url, $lib_filename);
}
require $lib_filename;
It is very useful if you intend to share a program as a single file, without Composer.
For a file hosted on GitHub, an ETag HTTP header is provided; it can be used to avoid downloading the whole file.
php -r 'var_dump(json_decode(get_headers("https://raw.githubusercontent.com/getopt-php/getopt-php/master/src/CommandInterface.php", 1)["ETag"]));'
//string(64) "c0153dbd04652cc11cddb0876c5abcc9950cac7378960223cbbe6cf4833a0d6b"
The ETag HTTP response header is an identifier for a specific version
of a resource. It lets caches be more efficient and save bandwidth, as
a web server does not need to resend a full response if the content
has not changed.
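A sketch building on the example above, skipping the download when a stored ETag still matches (the .etag side file is an assumption, not part of the original example):
<?php
$lib_url      = "https://raw.githubusercontent.com/getopt-php/getopt-php/master/src/CommandInterface.php";
$lib_filename = basename($lib_url);
$etag_file    = $lib_filename . '.etag'; // local side file holding the last seen ETag

$headers     = get_headers($lib_url, 1);
$remote_etag = isset($headers["ETag"]) ? $headers["ETag"] : '';
$local_etag  = is_file($etag_file) ? file_get_contents($etag_file) : '';

// Re-download if there is no local copy, or the ETags differ or could not be compared
if (!is_file($lib_filename) || $remote_etag === '' || $remote_etag !== $local_etag) {
    copy($lib_url, $lib_filename);
    file_put_contents($etag_file, $remote_etag);
}
require $lib_filename;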
Warning: include() [function.include]: URL file-access is disabled in the server configuration in /home/content/91/8151691/html/HolidaySavers.ca/europe-destinations-canada.php on line 52
says it all. You're attempting to include a URL-based file, which is denied in your server configuration. That points to one of two things:
You're attempting to include the file on site B from site A; in that case use file_get_contents('WhateverFile') instead of include('WhateverFile'), though this will only return the client-side output, since it is an HTTP request.
Or you've duplicated the file on site B and forgot to update the domain configuration. Be sure that the include path reflects the site you're running the script on, i.e.
include(dirname($_SERVER['SCRIPT_FILENAME']) . DIRECTORY_SEPARATOR . 'WhateverFile.php');
In any case, I would have to actually examine line 52 of that file to see in detail why PHP is complaining.

How to hide apache server version?

1] When I open any web page and view its source, I can see the JS and CSS files, and by clicking on them I can open each JS and CSS file.
Is there a way to prevent the .js and .css files from being opened like that?
2] The website is hosted on a Linux server. Using tools, hackers are able to view the Apache server version. Is there any way to hide it?
No, JavaScript and CSS files are sent to the browser, hence they are open to the user. You can try to obfuscate them, but that will only stall a determined user.
The server version is shown when an error (404, 500) occurs; you can override the default message with your own error pages using .htaccess.
CSS and JS files can be seen by anyone and everyone; they are served over HTTP. What you want to hide is your server-side code, like ASP, JSP, or PHP. The source code of those files cannot be seen.
To your second question:
Open your .htaccess and put the following in it:
ServerSignature Off
This way, Apache-generated error pages will not show the server signature (server name and version).
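One caveat worth knowing: ServerSignature Off only removes the signature line from Apache-generated pages; the Server response header itself is controlled by ServerTokens, which cannot be set in .htaccess and has to go in the main server configuration, for example:
# httpd.conf (main server config only, not .htaccess)
ServerTokens Prod
ServerSignature Off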
Hope it answers you :)
When a user gets a program, they get code which will be executed. On the desktop that may be binary data, for example. In web development there is no binary data: the browser gets the code, so the user does too. You can only obfuscate it.

Php page protection for cron task only

I am using linux cpanel shared hosting.
I am using http://aaa.com/script.php to scrape data from another website.
The PHP portion makes a cURL call to read the whole page content, then the page outputs the full content as HTML, and jQuery scraping plus an AJAX call insert the final data into MySQL.
(I decided to go for client-side jQuery scraping because the HTML to scrape is pretty complicated, and hard to handle with phpsimpledom and regex.)
I want this page to stop outputting HTML when it is
- not opened by me as a tester
- not opened by the local cPanel cron task.
So I put exit(); in the top few lines.
If the request is detected as legitimate, the rest of the HTML output at the bottom continues; otherwise the script just exits and shows an empty page.
Now for the security issue: what is the best way to make sure other visitors/bots to this page see an empty page?
If I put a password on the cron task, I don't think it can work, right?
Because script.php is scraping data, so if the scraped website's owner sees the visitor referral log, he can see the full URL including ?password=12345, can't he?
/usr/local/bin/php -f /home/mysite/public_html/dir/script.php?password=12345
If I put my script outside of public_html, like /usr/local/bin/php -f /home/mysite/script,
I don't think it will work for the jQuery part; that only works for the PHP part, doesn't it?
What else can I do?
You could configure Apache's virtual host to only allow access from your IP.
Anyone else would get a 404 Not Found or a 403 Forbidden, depending on how you configure it.
Here's a sample:
Order Deny,Allow
Deny from all
Allow from 127.0.0.1
Using 127.0.0.1 tells Apache to let requests from itself (i.e. cron) work, but no one else.
You can learn more by reading the Apache 2 docs.
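Note that Order/Deny/Allow is the Apache 2.2 syntax; on Apache 2.4 the equivalent would look like this (a sketch, with the directory path taken from the question):
<Directory "/home/mysite/public_html/dir">
    Require ip 127.0.0.1 ::1
</Directory>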
Passwords on the query string are a bad idea. You could check for valid IP addresses at the start of your PHP file. This will allow any request from a set of IP addresses to access the parsed jQuery output. All other IPs will be denied access.
$allowedIps = array('127.0.0.1', '::1');
if (!in_array($_SERVER['REMOTE_ADDR'], $allowedIps)) {
    echo 'No jQuery for you';
} else {
    echo 'jQuery goodness to follow...';
}
