I have a script that builds a document into a PDF after a form has been completed. It's a custom template script for Gravity PDF; after submitting the form, the user clicks a link to generate and download the PDF.
If I call exit() at the very end of the template code, it displays the plain text page generated for the compilation in the browser without making the PDF. So far so good.
On the confirmation page after submission, I want both the existing link to the PDF and an alternative link to the HTML (text-in-browser) version to work.
I am trying to code it so that if the URL happens to be xyz (the link for the HTML version), the exit() command runs and the HTML version is output. If it is any URL other than that one, the command is skipped and the PDF is fully compiled, e.g. when the alternative PDF download link is chosen.
I initially thought I should be looking at using exit(header('Location: xxxxx.php'));, but found that this is a redirect. I then tried:
<?php
if ($_SERVER['REQUEST_URI'] === 'https://example.com/pdf/5a47cec7d4b62/{entry_id}/textversion') {
    exit();
}
but that's not working.
How should I go about doing this? (The {entry_id} part of the URL varies for each form submission when Gravity PDF generates the link - that part works fine.)
Thanks in advance.
EDITED UPDATE: I managed to get it working by tying in preg_match() and using ([^&]*) for the variable part of the URL. The clickable link for the user is e.g. example.com/pdf/5a47cec7d4b62/{entry_id}/textversion
<?php
// Build the full URL of the current request.
function currentUrl( $trim_query_string = false ) {
    $pageURL  = ( isset( $_SERVER['HTTPS'] ) && $_SERVER['HTTPS'] == 'on' ) ? "https://" : "http://";
    $pageURL .= $_SERVER["SERVER_NAME"] . $_SERVER["REQUEST_URI"];

    if ( ! $trim_query_string ) {
        return $pageURL;
    } else {
        $url = explode( '?', $pageURL );
        return $url[0];
    }
}
?>
<?php // When the "textversion" URL is requested, stop here so the PDF is never compiled. ?>
<?php if ( preg_match( "#^https://example.com/pdf/5a47cec7d4b62/([^&]*)/textversion#", currentUrl() ) ): ?>
    <?php exit(); ?>
<?php endif; ?>
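As a side note, $_SERVER['REQUEST_URI'] contains only the path and query string (e.g. /pdf/5a47cec7d4b62/{entry_id}/textversion), not the scheme or host, which is why the earlier strict comparison against the full URL never matched. A shorter, untested sketch that checks the path alone, assuming the same link structure as above:

<?php
// Sketch: match only the path, so the scheme and host don't matter.
// Assumes the same /pdf/5a47cec7d4b62/{entry_id}/textversion link structure as above.
if ( preg_match( '#^/pdf/5a47cec7d4b62/[^/]+/textversion#', $_SERVER['REQUEST_URI'] ) ) {
    exit(); // stop before the PDF is compiled, leaving the HTML output in the browser
}
?>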
I have a tracking link (poly.wtf).
I'm trying to make a PHP page that checks the referrer with this condition: if the visitor came from my tracking link (poly.wtf), it redirects him to page X;
if the visitor didn't come from my tracking link (i.e. came from anywhere else, or just typed the URL into the browser), it redirects him to page Y.
I tried to use this PHP script:
<?php
$domain = 'poly.wtf';
$referrer = $_SERVER['HTTP_REFERER'];
if (preg_match($domain, $referrer)) {
    header('Location: https://www.google.com/');
} else {
    header('Location: https://www.yahoo.com/');
}
?>
but it doesn't work because, for some reason, it doesn't capture the referrer. Here is an example of a redirect from the poly.wtf link to the PHP page:
http://poly.wtf/5d9e31ce-bd7d-4500-bc4e-b7e7f7ddcacc
^^ In this case it should redirect me to Google, but it doesn't. What do you think the problem is?
You should enclose your pattern in delimiters (and ideally escape the dot), like:
$domain = '/poly\.wtf/';
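Putting it together, a minimal sketch of the whole redirect script with the delimited pattern, a guard for a missing Referer header, and an exit after sending the redirect (the target URLs are the same placeholders used above):

<?php
// Sketch: delimited pattern, missing-referrer guard, and exit after the redirect header.
$referrer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';

if (preg_match('/poly\.wtf/i', $referrer)) {
    header('Location: https://www.google.com/');
} else {
    header('Location: https://www.yahoo.com/');
}
exit;
?>

Bear in mind that the Referer header is set by the browser and can be stripped (for example on an HTTPS-to-HTTP hop or by privacy extensions), which would also explain an empty value.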
if (strpos(file_get_contents($_SERVER['SERVER_ADDR']), '<html>')) {
    echo 'true';
}
I'm trying to check whether the current page contains a string and, if so, echo true. I've tried this, but it doesn't work.
$_SERVER['SERVER_ADDR'] contains a completely different value than you seem to be expecting:
The IP address of the server under which the current script is executing.
To get a working link to the current page:
$currentUrl = (!empty($_SERVER['HTTPS']) && $_SERVER['HTTPS'] !== 'off') ? 'https://' : 'http://';
$currentUrl .= $_SERVER['HTTP_HOST'].$_SERVER['REQUEST_URI'];
However, this becomes a giant mess: if you request your own page from within that page's code, you end up with an endless chain of recursive retrievals that will eventually bring down your server. I would recommend not downloading the very page in which you are doing this.
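If the goal is only to test the page's own output for a string, one way to avoid fetching the page over HTTP at all is to buffer the output and inspect it before it is sent - a rough sketch:

<?php
// Sketch: capture everything this script prints, then inspect it before sending.
ob_start();

// ... the normal page output happens here ...

$html = ob_get_contents();                // read the buffer without discarding it
if (strpos($html, '<html>') !== false) {  // note: !== false, since a match at offset 0 is falsy
    echo 'true';
}
ob_end_flush();                           // send the buffered output to the browser
?>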
I want to display different pages in an iframe, according to the parent URL that is loaded. I have a MySQL DB which will be used for this purpose. I use the following code to fetch the appropriate link from the DB, but instead of showing that page, the iframe shows nothing (blank space).
This is the code, which I insert into my PHP files:
function curPageURL() {
    $pageURL = 'http://';
    $pageURL .= $_SERVER["SERVER_NAME"].$_SERVER["REQUEST_URI"];
    return $pageURL;
}
$conn=mysql_connect("localhost","user","password");
echo mysql_error();
mysql_select_db("db-name",$conn)or die(mysql_error());
$url=curpageURL();
$url=substr($url,0,strpos($url,'.',strpos($url,'?')));
$rs=mysql_query("select * from table-name where column1='".$url."'",$conn);
$affi=mysql_fetch_array($rs);
echo mysql_error();
echo '<iframe src="'. $affi['column2'].'" width="700" height="400"></iframe>';
As I said, this code shows only blank space and not the appropriate page. My site is running on SMF engine (Simple Machines Forum).
I hope someone can point out to me what has to be changed in the code.
Thank you!
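One quick debugging step is to echo $url after the substr() call and compare it with what is stored in column1; if the page has no query string, strpos($url, '?') returns false and the trimming may not cut where expected. For reference, a rough sketch of the same lookup using mysqli and a prepared statement (table and column names are taken from the snippet above, error handling omitted):

<?php
// Sketch only: same lookup as above, with mysqli and a prepared statement.
$url = curPageURL();
$url = substr($url, 0, strpos($url, '.', strpos($url, '?'))); // same trimming as above

$conn = new mysqli('localhost', 'user', 'password', 'db-name');
$stmt = $conn->prepare('SELECT column2 FROM `table-name` WHERE column1 = ?');
$stmt->bind_param('s', $url);
$stmt->execute();
$stmt->bind_result($iframeSrc);

if ($stmt->fetch()) {
    echo '<iframe src="' . htmlspecialchars($iframeSrc) . '" width="700" height="400"></iframe>';
} else {
    echo 'No match found for ' . htmlspecialchars($url); // helps spot a URL mismatch
}
?>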
Block a PHP page from public view, but still allow it to be used by another page.
Hi, I'm working on a website that has an AJAX live search (search.php). search.php is called from another PHP page to search the database, and it works just fine. The problem is that search.php can be typed into the URL bar and will display all the data from the database. I tried googling it, but I still don't have a clear idea of how to solve it. I've read that it can be done in .htaccess, or by changing permissions... I just want to understand how to properly fix the problem. Thanks.
How I would do it is by adding another $_POST variable to the request - something like the following:
$.post('search.php', {search: 'search term', secret: 'ajax'}, function (r) {
    $(insert).html(r);
});
and then on the page that receives the POST:
if (isset($_POST['secret']) && $_POST['secret'] == 'ajax') {
    // do your submitting..
} else {
    // display some kind of error message, since the secret POST value wasn't found
}
Try something like this in the first lines of the search.php file:
/*************** DO NOT ALLOW DIRECT ACCESS ***************/
// NOT FALSE if this script's file name is found in the requested script name
if ( strpos( strtolower( $_SERVER['SCRIPT_NAME'] ), strtolower( basename( __FILE__ ) ) ) !== FALSE ) {
    header( 'HTTP/1.0 403 Forbidden' );
    die( '<h2>Forbidden! Access to this page is forbidden.</h2>' );
}
/***********************************************************/
I solve this problem by having non-landing PHP pages live outside the public HTML folder, often in a /php or /lib folder just above it. This assumes that all your landing pages include some form of bootstrap that adds the /php folder to your include path.
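A minimal sketch of that kind of bootstrap, assuming the non-public code lives in a /php folder one level above the public web root (file names here are just placeholders):

<?php
// bootstrap.php, included by every public landing page.
// Assumes the non-public code lives in ../php relative to this file's folder.
set_include_path(
    get_include_path() . PATH_SEPARATOR . dirname(__DIR__) . '/php'
);

// Landing pages can now do: require 'search_backend.php';
// while the /php folder itself is never reachable by URL.
?>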
Another easy-ish solution is to have your landing PHP files set a value or constant, and have your includes refuse to execute if the value isn't set. This way the included PHP file cannot be executed directly.
index.php
<?php
define('VALID_REQUEST', 1);
include "include.php";
...
include.php
<?php
if (!defined('VALID_REQUEST')) { exit; }
...
Try this; make sure to place it at the top of the search.php page :D
if (basename(__FILE__) == basename($_SERVER['PHP_SELF'])) send_404();

function send_404()
{
    header('HTTP/1.0 404 Not Found');
    print '<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">'."\n".
          '<html><head>'."\n".
          '<title>404 Not Found</title>'."\n".
          '</head><body>'."\n".
          '<h1>Not Found</h1>'."\n".
          '<p>The requested URL '.
          str_replace(strstr($_SERVER['REQUEST_URI'], '?'), '', $_SERVER['REQUEST_URI']).
          ' was not found on this server.</p>'."\n".
          '</body></html>'."\n";
    exit;
}
Hi, I have the following page, which sets a cookie with the current URL and also contains a simple external link.
<?php
function pageURL()
{
$pageURL = 'http';
if ($_SERVER["HTTPS"] == "on")
{
$pageURL .= "s";
}
$pageURL .= "://";
if ($_SERVER["SERVER_PORT"] != "80")
{
$pageURL .= $_SERVER["SERVER_NAME"].":".$_SERVER["SERVER_PORT"].$_SERVER["REQUEST_URI"];
}
else
{
$pageURL .= $_SERVER["SERVER_NAME"].$_SERVER["REQUEST_URI"];
}
return $pageURL;
}
$CurrentPage = pageURL();
setcookie('linkback', $CurrentPage);
?>
<p><a href="http://www.google.com/">External Link</a></p>
What I want to do is, using PHP, add a prefix to all external links so that they have the following structure:
localhost/outgoing?url=http://www.google.com/
This loads up an outgoing page like so:
<?php
if(!isset($_GET['url']))
{
header('HTTP/1.0 404 Not Found');
}
?>
<h1>Warning! Now leaving website</h1>
<ul>
<li><a title="Return to Website" href="<?php if(isset($_COOKIE['linkback'])) { echo $_COOKIE['linkback']; } else { echo 'http://localhost:8888/creathive/'; } ?>">Return to Website</a></li>
<li><a title="Continue to <?php echo $_GET['url']; ?>" href="<?php echo $_GET['url']; ?>">Continue to <?php echo $_GET['url']; ?></a></li>
</ul>
The idea is that, using the cookie set on the previous page, I can have a simple back button, and also grab the URL from the query string and allow the user to continue after being warned that they are leaving the site.
The problems I have are:
1.) Prefixing external URLs so that they go to the outgoing page
2.) The isset at the top of the outgoing page is supposed to throw a 404 if a user visits the outgoing page without a url query string, but it isn't
3.) Need to make sure that URLs are valid, for example to prevent this from happening: localhost/outgoing?url=moo
You will need to replace every external URL in your code according to the new scheme. There is no way of doing this automatically for all outgoing links.
This is because, when the user clicks an external URL, the request is not sent to your server but to the external one.
Edit:
The only thing you could do is cache all your output with ob_start() and then replace all the links using a regexp before printing them on your page.
But the better solution would be to replace all the links yourself.
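A rough sketch of that output-buffering idea; the pattern is deliberately simple and assumes plain, double-quoted href attributes (the local host/path prefix is taken from the question):

<?php
// Sketch: buffer the whole page, then rewrite external hrefs to go via /outgoing.
ob_start(function ($html) {
    return preg_replace_callback(
        '#href="(https?://[^"]+)"#i',
        function ($m) {
            // Links to your own site are left untouched.
            if (stripos($m[1], 'http://localhost:8888/creathive/') === 0) {
                return $m[0];
            }
            return 'href="/outgoing?url=' . urlencode($m[1]) . '"';
        },
        $html
    );
});

// ... normal page output below ...
?>

As noted above, generating the prefixed links yourself is the cleaner solution; this is more of a retrofit.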
Can you elaborate?
isset() isn't exactly the "right" tool; try array_key_exists('url', $_GET). Code like this should do ya.
function haveUrl() {
    if (array_key_exists('url', $_GET)) {
        $url = $_GET['url'];
        if (!empty($url)) return true;
    }
    return false;
}
You can simply check whether they start with http:// or https://. This may be done best with a regex, something like:
function validUrl($url) {
    if (preg_match('#^(http://|https://)#i', $url)) {
        if (preg_match('#localhost#i', $url)) return false;
        return true;
    }
    return false;
}
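Used together at the top of the outgoing page (assuming both helpers above are loaded first), that might look like:

<?php
// Sketch: reject a missing or invalid url parameter before rendering anything.
if (!haveUrl() || !validUrl($_GET['url'])) {
    header('HTTP/1.0 404 Not Found');
    exit; // stop here so the rest of the warning page is never rendered
}
$url = $_GET['url'];
?>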
1) I would create a class for all links and use something like $links->create("http://...") - that is where you put the redirection and anything else you might need (see the sketch after point 3 below).
3)
You could use the code in the following link:
link text
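A minimal sketch of that link-helper idea (the class name, method, and host value are just placeholders):

<?php
// Sketch: central helper that decides whether a link needs the /outgoing prefix.
class LinkBuilder
{
    private $host;

    public function __construct($host)
    {
        $this->host = $host; // your own host name, e.g. 'localhost'
    }

    public function create($url, $text)
    {
        $linkHost = parse_url($url, PHP_URL_HOST);
        // Only external links (a host that is present and differs from ours) get the prefix.
        if ($linkHost !== null && $linkHost !== $this->host) {
            $url = '/outgoing?url=' . urlencode($url);
        }
        return '<a href="' . htmlspecialchars($url) . '">' . htmlspecialchars($text) . '</a>';
    }
}

// Usage:
// $links = new LinkBuilder('localhost');
// echo $links->create('http://www.google.com/', 'External Link');
?>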
1.) Prefixing external URLs so that they go to the outgoing page
You need a DOM tool for this: for instance SimpleXML, Simple HTML DOM Parser, or anything of that kind, to manipulate the outgoing links on the page you are rendering. (Find more available options in this question.)
Ideally, you will cache the result of this operation, as it can be quite expensive (read: slow), and refresh the cached version whenever the content changes or after a defined period of time.
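For example, a compact sketch with PHP's built-in DOMDocument (just one of several suitable tools):

<?php
// Sketch: load the rendered page markup and rewrite external anchor hrefs.
$dom = new DOMDocument();
@$dom->loadHTML($html);                        // $html holds the page markup; @ hides warnings on loose HTML

$ownHost = strtok($_SERVER['HTTP_HOST'], ':'); // host name without any port

foreach ($dom->getElementsByTagName('a') as $a) {
    $href = $a->getAttribute('href');
    $host = parse_url($href, PHP_URL_HOST);
    if ($host !== null && $host !== $ownHost) { // only rewrite links pointing off-site
        $a->setAttribute('href', '/outgoing?url=' . urlencode($href));
    }
}
$html = $dom->saveHTML();
?>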
2.) The isset at the top of the outgoing page is supposed to throw a 404 if a user visits the outgoing page without a url query string, but it isn't
Yes, it does, but you need to stop execution after that point if you don't want to render the rest of the page. A 404 error can - and should! - have a page body; it's a response like any other HTTP response.
3.) Need to make sure that URLs are valid, for example to prevent this from happening: localhost/outgoing?url=moo
Even if you do so - and indeed you should - nothing will prevent anyone from accessing localhost/outgoing?url=foo by manual URL manipulation. URL parameters, like any other user input, can never be trusted. In other words, you need to check the outgoing URL in the outgoing script no matter what.
And don't forget to sanitize rigorously! Functionality like this is begging to be abused.
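A minimal sketch of that validation, assuming the same outgoing page structure as in the question:

<?php
// Sketch: validate the url parameter before trusting it anywhere on the page.
$url = isset($_GET['url']) ? $_GET['url'] : '';

$valid = filter_var($url, FILTER_VALIDATE_URL) !== false
    && in_array(parse_url($url, PHP_URL_SCHEME), array('http', 'https'), true);

if (!$valid) {
    header('HTTP/1.0 404 Not Found');
    echo '<h1>Not Found</h1>'; // a 404 can (and should) still have a body
    exit;                      // stop so the rest of the warning page never renders
}

// Escape before echoing it back into the warning page.
$safeUrl = htmlspecialchars($url, ENT_QUOTES, 'UTF-8');
?>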