Protocol Host showing in our Error Log / 404 Errors - php

I'm getting an error showing up in my error logs over and over. I see it in an error log in cPanel as well as in an AWStats report.
The errors look like this:
/my_directory/'%20+%20protocol_host%20+%20'/images/greenthumb.png
/my_directory/'%20+%20protocol_host%20+%20'/images/imgsicon.png
I'm seeing this thousands of times a day.
The legit path in the example above would be something like this:
/my_directory/page.php?id=123454
(i.e. www.my_site.com/my_directory/page.php?id=123454)
Any ideas what this protocol_host is referring to and why it would be hitting my error log so often?
My research before posting this question led me to something related to the search indexer on a windows operating system computer, but I can't see the connection.
Thanks in advance as always

I'm with @Shal. Although Google gives you some results for 'protocol_host', I don't think they are related to your problem. I guess it is either an error in your PHP code which generates the links, or an erroneous client (or a hacker) accessing your site.

Looking at the error URLs, it seems like the paths to some images are generated dynamically by concatenating strings, but the quotes are not set correctly.
/my_directory/'%20+%20protocol_host%20+%20'/images/greenthumb.png
without URL encoding is
/my_directory/' + protocol_host + '/images/greenthumb.png
PHP uses . to concatenate strings, but here a + is used. So I would guess that some JavaScript code causes this error.
Check your JS resources!
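To make the quoting mistake concrete, here is a minimal JavaScript sketch of the kind of bug that would produce exactly those 404s (the variable name protocol_host is taken from your log; everything else is made up for illustration):

// Assumed variable; its real definition in your code is unknown.
var protocol_host = 'http://www.my_site.com';

// Broken: the whole path is one double-quoted string, so the single quotes
// and the + are never interpreted as concatenation. If this string is used
// as an image src unchanged, the browser requests the literal text
// /my_directory/' + protocol_host + '/images/greenthumb.png,
// which is exactly what shows up (URL-encoded) in your error log.
var broken = "/my_directory/' + protocol_host + '/images/greenthumb.png";

// Intended: close the string and actually concatenate the variable
// (whatever the intended final URL layout really is).
var fixed = '/my_directory/' + protocol_host + '/images/greenthumb.png';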

I would guess that you are linking some resources relatively, without a leading slash.
example:
<img src="images/foo.png" />
Usually this leads to URLs like http://example.com/images/foo.png instead of http://example.com/my_project/images/foo.png, but it can have other consequences as well.
This is just a guess though.
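To illustrate the difference (the directory and file names are made up), the same image referenced both ways:

<!-- Relative path: resolves against the directory of the page that contains it,
     so the same markup can request different URLs on different pages. -->
<img src="images/foo.png" />

<!-- Root-relative path: always resolves against the site root,
     no matter which page includes it. -->
<img src="/my_project/images/foo.png" />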

Related

object not found $_GET

I'm trying to learn PHP but for some reason I am getting this weird issue (I am using XAMPP):
I have some simple code:
<?php
echo $_GET['name'];
?>
and whenever I type http://localhost/lee.php I get an undefined index notice, like I am supposed to, but when I type http://localhost/lee.php&name=lee I get an "Object not found - The requested URL was not found on this server" error.
Does anybody know why this is happening? Is it my code or my PC, maybe?
http://localhost/lee.php&name=lee
is an incorrect URL altogether; it should be
http://localhost/lee.php?name=lee
Note the ? before the first parameter. Only when you add a further parameter do you use an &; the first one always comes after the ?, for example
http://localhost/lee.php?name=lee&age=20
I was going to refer you to the HTTP specification documents, but since you mentioned you are learning PHP, I thought that might be too overwhelming for you.
And your code is fine.
Free tip, since you said you just started learning:
Always read error messages and believe what they say while you are investigating an issue; they really are there for a purpose. For example:
The requested URL was not found on this server.
That error message means that the URL is not there on the server. If I were you, I would care less about my code at that point and more about why that URL is not there when my file clearly is, and that would have led me to the conclusion that the URL format is wrong.
A lot of people overlook error messages, even at advanced stages of learning, and say "no, I have everything right and the error message is weird". No, it isn't.
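As a side note on the "undefined index" notice you get when the parameter is missing, here is a minimal defensive sketch (the 'guest' fallback is just an example value):

<?php
// Only read the parameter if it is actually present in the query string,
// otherwise fall back to a default instead of raising an undefined index notice.
$name = isset($_GET['name']) ? $_GET['name'] : 'guest';

// Escape the value before echoing it so it cannot be interpreted as HTML.
echo htmlspecialchars($name);
?>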

$_SERVER["HTTP_REFERER"] not returning full URL

I'm trying to apply a quick patch to address an issue with an extension we're using. As a result, please pardon this "bandaid-like" fix that I'm requesting assistance with. This is merely an effort to fix an issue in about 20 minutes or less and schedule in a permanent fix for later in the week.
That being said, I am struggling to grab the value that I would expect when using $_SERVER["HTTP_REFERER"]. Our URL is somewhat odd at the moment. An example URL is below:
http://domain.com/custom-wheels-performance-tires/custom-wheels.html#/custom-wheels-performance-tires/custom-wheels.html?wheel_diameter=2663
When using $_SERVER["HTTP_REFERER"], the value I'm getting (for the URL above) is:
http://domain.com/custom-wheels-performance-tires/custom-wheels.html
Evidently, it is being cut off at the # in the URL. Common sense would be to remove that from the URL, but I'm going to have to dig into someone else's code to do that and it exceeds the time allocated for this patch. Is there a way to get the full URL (even if it isn't $_SERVER["HTTP_REFERER"])?
I appreciate any and all assistance!
Due to the way URLs are handled by browsers, the server never receives anything past the hash fragment identifier (#). The fragment is intended to be used by the browser to scroll the page to an anchor.
However, it is possible to use JavaScript to get the fragment and send it to the server.
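A rough sketch of that approach, assuming you are free to pick the parameter name (here fragment) and the endpoint that receives it:

// The fragment is only known to the browser, so read it client-side...
var fragment = window.location.hash; // e.g. "#/custom-wheels-performance-tires/..."

if (fragment) {
    // ...and hand it to PHP explicitly as an ordinary query parameter,
    // where it becomes available as $_GET['fragment'].
    var url = '/log-referer.php?fragment=' + encodeURIComponent(fragment.substring(1));

    // Send it in the background; alternatively, append it to the links
    // or redirects your extension generates.
    var xhr = new XMLHttpRequest();
    xhr.open('GET', url);
    xhr.send();
}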

Website optimization and random javascript errors in google chrome console

I have a really big problem with my website: http://ap.v11.pl/sklep/
It loads really slowly and I don't know how to fix that.
I'm getting some weird errors in the Chrome console: http://scr.hu/0an/xq5bz
The errors are random; for example, I get an error that something can't be found, although the resource exists and the paths are good.
My htaccess:
http://pastebin.com/ewZZBLFg
The page runs on Zend Framework 2.
Thank you for any advice.
My hypothesis is:
you are running Ghostery as a Chrome plugin, or something similar, so that your browser blocks a couple of your scripts, like the adstat thing and Google Analytics
your webserver has a problem sending the correct MIME type for the JavaScript files; check out this posting on the "resource interpreted as a ..." error message (a minimal .htaccess sketch follows below this list)
it may be that only one frontend is not working correctly, which would explain why you only get the errors some of the time
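If the MIME type does turn out to be the problem, a one-line sketch for the .htaccess (assuming Apache with mod_mime, and that .js files are currently served with the wrong type):

# Serve .js files with a proper JavaScript MIME type.
AddType application/javascript .js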
In general your site is packed with scripts and images. The first page makes more than 250 requests and weighs almost 4 MB. That's a lot, and it takes time. Amazon's front page has half the number of requests and weighs something like 300 KB.
You should check whether you can reduce the number of requests; the YSlow plugin may give you some good advice here. Can you reduce the image sizes and the number of images (CSS sprites?)
You should also check whether you have to deliver all the images through your regular web server or whether you can use a lightweight alternative. Are you using NGINX? AFAIK it has good options for performance tuning.
Edit: As a starting point: http://gtmetrix.com/reports/www.ap.v11.pl/fBGKScZ6

PHP Javascript page loads twice

I have written this small site, with registration and everything, and I have got to a point where I am not too sure what is happening.
It first started with the DB reporting to me that the user I am trying to write into the DB is a duplicate entry (where it should be unique), which really puzzled me: how can it be that I have a duplicate? Well, it took me three days to realize that the page is somehow being called twice!
I put a
$_SESSION['one']=0;
and a
$_SESSION['two']=0;
in the topmost and bottommost parts of the page respectively.
Then I changed them both to ++, so I could check how many times each had been passed through.
I used the verification link from the email the site had sent, and tested their values.
Strangely enough, 'one' would equal 2 and 'two' would equal 1...
This explains exactly why everything worked registration-wise, but I got all those errors about duplicates.
However, I used Firebug to trace any redirections, but couldn't see anything...
It shows the page has 12 GETs and a POST.
I was hoping to bump into a redirect and debug accordingly, but alas; or maybe I just don't know how to use Firebug to trace these redirects...
I would appreciate any suggestion
Thanks in advance!
Like I said in the comments, here is the answer that worked for me and Ted:
What I did to fix it was change my doctype to HTML5 and then validate the page using the W3C validator; this problem was only occurring for me in Firefox with Firebug.
Do you have some <img> or <script> tags with src=""? Or maybe some <link> stylesheet with an empty href?
Is Firebug showing you, in the Net tab, that your site is called twice?
This kind of issue usually happens when you have a fatal or otherwise unignorable error that forces PHP to terminate your script early, in the midst of processing a request.
Check your error log for details of what happened.
It is Firebug that was causing the page to load twice. Apparently changing the doctype to HTML5, i.e.
<!DOCTYPE html>
instead of HTML4, has worked around that bug.
When you are going to deploy your site, go back to HTML4, and of course always keep the server safe from such bugs: use DB constraints, validation and escaping when needed.
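For the duplicate-entry part, a rough sketch of what that can look like (assumes a PDO connection in $pdo with exceptions enabled, values from the registration form in $email and $name, and a made-up users table with a UNIQUE index on email):

<?php
try {
    // The UNIQUE index guarantees that a second, accidental run of the same
    // request cannot create a duplicate row.
    $stmt = $pdo->prepare('INSERT INTO users (email, name) VALUES (?, ?)');
    $stmt->execute(array($email, $name));
} catch (PDOException $e) {
    // SQLSTATE 23000 = integrity constraint violation (duplicate key):
    // the user already exists, so treat the repeated call as a no-op.
    if ($e->getCode() != '23000') {
        throw $e;
    }
}
?>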
Hope this helps, and big thanks to @jeffreydev!!

Passing variable to Google Charts URL

This is probably something really simple; however, I am quite new to PHP and haven't done any HTML in years.
I need to get a PHP variable filled with an array of figures into Google Charts. My code for this so far is:
<img src="http://chart.apis.google.com/chart?
&chs=340x175
&chd=t:<?=$filedetail[1]?>
&cht=lc
&chtt=Test
">
However, Google reports an error, as it stops at the ?=$filedetail[1] for some reason. It doesn't seem that reading the variable is the problem, more that the API simply can't read past the start of the PHP tags.
Thanks,
Rob A.
EDIT: I have managed to make Google accept the URL, however now it is not showing anything on the chart, as it's filling in the &chd=t: field with the literal PHP tag instead of the figures within that variable.
The URL reads like this:
http://chart.apis.google.com/chart?&chs=340x175&chd=t:%3C?=$filedetail[1]?%3E&cht=lc&chtt=Test
If you say Google is complaining about the ?=$filedetail, chances are you are doing this in a file that is not being parsed by PHP, for example a file that ends with .html or .htm.
You can see whether this is the case by looking into the page's source code in the browser. If you see the PHP command in the source as you wrote it above, the PHP code was never executed.
The easiest way to fix that, if that's the problem, would be to switch to a .php file extension.
In URLs embedded in HTML, a literal & should be written as &amp;
Edit: And you can't do ?&chs -- it should be ?chs. The line breaks are probably going to break the URL too...
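If it helps, a minimal sketch of the same tag with the whole URL built in PHP first (assumes the file has a .php extension and that $filedetail[1] already holds a comma-separated list of plain numbers):

<?php
// Build the chart URL in one piece so there are no line breaks or stray
// ? / & to confuse the API. Plain digits and commas need no extra encoding.
$chartUrl = 'http://chart.apis.google.com/chart'
          . '?chs=340x175'
          . '&chd=t:' . $filedetail[1]
          . '&cht=lc'
          . '&chtt=Test';
?>
<img src="<?php echo htmlspecialchars($chartUrl); ?>" alt="chart">

htmlspecialchars() also takes care of turning the & separators into &amp; inside the HTML attribute.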
