PHP Array to string conversion with preg_match

I get an "Array to string conversion" error when I use this script:
$pages = array('/about.php', '/');
//...............function text here................//
$ua = $_SERVER['HTTP_USER_AGENT'];
$mobiles = '/iphone|ipad|android|symbian|BlackBerry|HTC|iPod|IEMobile|Opera Mini|Opera Mobi|WinPhone7|Nokia|samsung|LG/i';
if (preg_match($mobiles, $ua)) {
    $thispage = $_SERVER["HTTP_HOST"].$_SERVER["REQUEST_URI"];
    if ($thispage == $_SERVER["HTTP_HOST"].$pages) {
        ob_start("text");
    }
}
This script changes the style of certain pages depending on the user's user agent, which is exactly what I need. But I don't know how to write it properly in PHP. Maybe I need something like "foreach ($pages as $i)"? I tried that, but it didn't work the way I wrote it.

You are trying to check whether the requested resource, $_SERVER["REQUEST_URI"], is in a predefined list of resource paths.
Change your condition as shown below (using the in_array() function):
...
if (in_array($_SERVER["REQUEST_URI"], $pages)) {
    ob_start("text");
}
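Putting that together with the question's script, the corrected block might look like this (just a sketch; the "text" output-buffer callback and the $pages list are taken from the question as-is):
$pages = array('/about.php', '/');
$ua = $_SERVER['HTTP_USER_AGENT'];
$mobiles = '/iphone|ipad|android|symbian|BlackBerry|HTC|iPod|IEMobile|Opera Mini|Opera Mobi|WinPhone7|Nokia|samsung|LG/i';
if (preg_match($mobiles, $ua)) {
    // Compare the request path against the whitelist instead of
    // concatenating the array (the array-to-string concatenation triggered the error).
    if (in_array($_SERVER["REQUEST_URI"], $pages)) {
        ob_start("text");
    }
}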

Related

Retrieving whole URL, even with ampersands, using _GET

I made a quick script to retrieve and parse XML. It's simply for internal use, and I thought that appending the feed URL to the script address would be a convenient way to initialize the script...
www.example.com/feed_analyzer.php?url=www.example.com/an_xml_feed.xml
Then I simply grab the URL...
if (isset($_GET['url'])) {
    $xml_url = $_GET['url'];
}
...retrieve the file at $xml_url, parse etc.
All was fine until this URL came along with pesky parameters:
www.example.com/an_xml_feed.xml?foo=bar&rice=chips
That of course left me with the URL "www.example.com/an_xml_feed.xml"
I have managed to "patch back together" the whole URL using this clunky code:
if (isset($_GET['url'])) {
    $got = '';
    foreach ($_GET as $key => $value) {
        $got .= "&" . $key . "=" . $value;
    }
    $xml_url = ltrim($got, '&url=');
}
Can someone please suggest a more elegant approach.
You can take the whole query string directly and strip the leading url= prefix:
substr($_SERVER['QUERY_STRING'], strlen('url='));
(Careful: ltrim($_SERVER['QUERY_STRING'], 'url=') looks shorter, but ltrim() treats its second argument as a set of characters to trim, not as a prefix, so it only happens to work here.)
urlencode() is the way to go :)
Try it like this:
if (isset($_GET['url'])) {
    $query_string = $_SERVER['QUERY_STRING'];
    if (!empty($query_string)) {
        list($key, $xml_url) = explode('url=', $query_string);
        echo $xml_url;
    }
}
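As the urlencode() comment above hints, the most robust fix is to encode the feed address when the link to the script is built, so the whole value survives as a single url parameter. A minimal sketch (the link-building side is an assumption; feed_analyzer.php and the example feed come from the question):
// Wherever the link to the analyzer is generated:
$feed = 'http://www.example.com/an_xml_feed.xml?foo=bar&rice=chips';
$link = 'http://www.example.com/feed_analyzer.php?url=' . urlencode($feed);

// In feed_analyzer.php, PHP decodes the parameter automatically:
if (isset($_GET['url'])) {
    $xml_url = $_GET['url']; // full URL, including ?foo=bar&rice=chips
}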

How to include 2 checks in an if statement

Basically I am coding a script that simply redirects the user to the destination page. I want to check that the value doesn't match any of several allowed websites; if it doesn't, it should show an error, otherwise it should proceed.
I can't seem to get this to work, although I am sure there's a way to check multiple values.
<?php
$url = $_GET['site']; // gets the site URL the user is being redirected to.
if ($url != "***.co", "***.net")
{
    echo ("Website is not valid for redirection.");
} else {
    echo ("You are being redirected to: " . $url);
}
?>
You can make an array of items to check for and then check if the url is in the array:
if (!in_array($url, array("***.co", "***.net")))
{
}
You can also use multiple conditions like #wrigby showed, but the array-based solution makes it easier to add more URLs (or a dynamic number of them). If there will only ever be two, his is better.
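Folded into the question's script, the array version might look something like this (a sketch; the echo messages are copied from the question):
<?php
$url = $_GET['site']; // gets the site URL the user is being redirected to.
$allowed = array("***.co", "***.net");
if (!in_array($url, $allowed)) {
    echo ("Website is not valid for redirection.");
} else {
    echo ("You are being redirected to: " . $url);
}
?>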
You'll need two complete conditionals, connected with a logical and (&&) operator:
<?php
$url = $_GET['site']; // gets the site URL the user is being redirected to.
if ($url != "***.co" && $url != "***.net")
{
    echo ("Website is not valid for redirection.");
} else {
    echo ("You are being redirected to: " . $url);
}
?>

Compare two strings (urls) for same domain

I'm trying to compare two URLs using PHP, ensuring that the domain name is the same. It can't merely be a matching sub-domain; it has to literally be the same domain. Example:
http://www.google.co.uk would validate as true compared to http://www.google.co.uk/pages.html.
but
http://www.google.co.uk would validate as false compared to http://www.something.co.uk/pages.html.
Use parse_url(), and compare the "host" index in the array returned from the two calls to parse_url().
Use parse_url()
$url1 = parse_url("http://www.google.co.uk");
$url2 = parse_url("http://www.google.co.uk/pages.html");
if ($url1['host'] == $url2['host']) {
    // matches
}
Simple: use parse_url().
$url1 = parse_url('http://www.google.co.uk');
$url2 = parse_url('http://www.google.co.uk/pages.html');
if ($url1['host'] == $url2['host']) {
    // same domain
}
You could use parse_url() for this:
$url1 = parse_url('http://www.google.com/page1.html');
$domain1 = $url1['host'];
$url2 = parse_url('http://www.google.com/page2.html');
$domain2 = $url2['host'];
if ($domain1 == $domain2) {
    // something
}
Expanding on the answer given by Ariel, the code you could use is similar to the following:
<?php
compare_host('http://www.google.co.uk', 'http://www.something.co.uk/pages.html');

function compare_host($url1, $url2)
{
    // PHP prior to 5.3.3 emits a warning if the URL parsing fails.
    $info = @parse_url($url1);
    if (empty($info)) {
        return FALSE;
    }
    $host1 = $info['host'];

    $info = @parse_url($url2);
    if (empty($info)) {
        return FALSE;
    }
    return (strtolower($host1) === strtolower($info['host']));
}
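For reference, this is what the function returns for the two comparisons given in the question (assuming compare_host() is defined as above):
var_dump(compare_host('http://www.google.co.uk', 'http://www.google.co.uk/pages.html'));    // bool(true)
var_dump(compare_host('http://www.google.co.uk', 'http://www.something.co.uk/pages.html')); // bool(false)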

Problem with including files based on (non-defined) variable

I have a PHP site with the following code in it:
<?php
$p = $_GET['p'];
include("$p.inc");
?>
Whenever I send a visitor to a page like index.php?p=contact for example I want the file contact.inc to be included. This works fine.
Now I want a certain file to be included (e.g. start.inc) when the visitor is sent to index.php without any GET variables. However, an error message is returned which tells me that $p is undefined (which it logically is).
I tried fixing this problem by using the isset function like so:
<?php
if(!isset($p)) $p = "start";
else $p = $_GET['p'];
include("$p.inc");
?>
but this doesn't work because now $p always contains the string "start" and I can't send the visitor to index.php?p=contact anymore - it will still include start.inc
Can somebody please help me with this issue?
Thanks in advance!
Explicitly whitelist the values that are allowed to arrive from outside.
<?php
$allowed_pages = array(
    'home'    => 'home.inc',
    'contact' => 'contact.inc',
);
$page = @$_GET['p'];
$file = array_key_exists($page, $allowed_pages) ? $allowed_pages[$page] : $allowed_pages['home'];
include($file);
?>
You should white-list your pages anyway, for security. So:
<?php
$p = isset($_GET['p']) ? $_GET['p'] : '';
switch ($p) {
    case 'contact':
        include("contact.inc");
        break;
    default:
        include("start.inc");
}
?>
Define your $p variable just like this:
$p = array_key_exists('p', $_GET) ? preg_replace('!\W!', '', $_GET['p']) : 'start';
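For context, here is how that one-liner might slot into the original script (a sketch; the 'start' fallback and the \W stripping come from the answer, and the "$p.inc" include follows the question's layout):
$p = array_key_exists('p', $_GET) ? preg_replace('!\W!', '', $_GET['p']) : 'start';
include("$p.inc"); // e.g. start.inc or contact.inc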
You're checking $p instead of $_GET['p'], so, since $p is never set, you always land on the start page.
Anyway, you have to sanitize this variable first.
Good practice would look like this (assuming pages are stored in a "pagedata" folder and have a .php extension):
if (isset($_GET['p'])) {
    $p = basename($_GET['p']);
} else {
    $p = "start";
}

$fileName = "pagedata/$p.inc.php";
if (is_readable($fileName)) {
    include($fileName);
} else {
    include("pagedata/404.html");
}
You should prefer an array-map or a switch like Nanne suggested.
At the very least use basename() if you want to keep using the $p variable directly in the include statement. And this is how you could avoid the "error" (which is a debug notice, btw):
<?php
$p = @$_GET["p"] or $p = "start";
$p = preg_replace("/\W+/", "", $p); // minimum filtering
include("./$p.inc");
?>
Thanks to you all!
I combined most of your suggestions into the following piece of code:
<?php
$pages = array(
    'start'   => 'Start.inc',
    'contact' => 'Contact.inc',
    'about'   => 'About.inc',
);
$p = array_key_exists(@$_GET['p'], $pages) ? preg_replace('!\W!', '', $_GET['p']) : 'start';
$p = ucfirst($p);
$page = "./$p.inc";
if (is_readable($page)) include($page);
else include("./404.inc");
?>
I particularly like the array map (as suggested by Alex and mario) for security reasons, as well as the error page idea by Col. Shrapnel.

Extract part from URL for a query string

I need a certain part of a URL extracted.
Example:
http://www.domain.com/blog/entry-title/?standalone=1 is the given URL.
blog/entry-title should be extracted.
However, the extraction should also work with http://www.domain.com/index.php/blog/[…] as the given URL.
This code is for a Content Management System.
What I've already come up with is this:
function getPathUrl() {
    $folder = explode('/', $_SERVER['SCRIPT_NAME']);
    $script_filename = pathinfo($_SERVER['SCRIPT_NAME']); // supposed to be 'index.php'
    $request = explode('/', $_SERVER['REQUEST_URI']);

    // first element is always ""
    array_shift($folder);
    array_shift($request);

    // now it's only the request url. filtered out containing folders and 'index.php'.
    $final_request = array_diff($request, array_intersect($folder, $request));

    // the indexes are mangled up in a strange way. turn 'em back
    $final_request = array_values($final_request);

    // remove empty elements in array (caused by superfluous slashes, for instance)
    array_clean($final_request);

    // make a string out of the array
    $final_request = implode('/', $final_request);

    if ($_SERVER['QUERY_STRING'] || substr($final_request, -1) == '?') {
        $final_request = substr($final_request, 0, - strlen($_SERVER['QUERY_STRING']) - 1);
    }

    return $final_request;
}
However, this code does not take care of the arguments at the end of the URL (like ?standalone=1). It works for anchors (#read-more), though.
Thanks a ton guys and have fun twisting your brains. Maybe we can do this shit with a regular expression.
There are many examples and plenty of info on what you want at:
http://php.net/manual/en/function.parse-url.php
That should do what you need:
<?php
function getPath($url)
{
    $path = parse_url($url, PHP_URL_PATH);
    $lastSlash = strrpos($path, "/");
    return substr($path, 1, $lastSlash - 1);
}
echo getPath("http://www.domain.com/blog/entry-title/?standalone=1");
?>
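The question also asks for the extraction to work when the path contains index.php (e.g. http://www.domain.com/index.php/blog/…), which the function above doesn't handle. One possible extension, sketched here as an assumption rather than a tested solution, is to strip a leading /index.php segment from the parsed path first:
<?php
function getPath($url)
{
    $path = parse_url($url, PHP_URL_PATH);              // e.g. "/index.php/blog/entry-title/"
    $path = preg_replace('#^/index\.php#', '', $path);  // drop a leading "/index.php" if present
    $lastSlash = strrpos($path, "/");
    return substr($path, 1, $lastSlash - 1);
}
echo getPath("http://www.domain.com/index.php/blog/entry-title/?standalone=1"); // blog/entry-title
echo getPath("http://www.domain.com/blog/entry-title/?standalone=1");           // blog/entry-title
?>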
