Will this get Search Engines to index my content? - php

I have a classifieds website...
The classifieds are displayed in a dynamic php page.
For instance, if one searches for "bmw m3" in the form (which is in index.html) and submits, then a php page will appear showing results.
The PHP page, results.php, connects to the MySQL database, fetches the matching classifieds, puts them in a table, and then outputs the table with a simple echo:
<body>
<?php echo $table; ?>
</body>
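For illustration, a minimal results.php along those lines might look like this (a sketch only; the ads table, the ad_id and headline columns, the q parameter, and the credentials are placeholders, not the site's actual names):
<?php
// Sketch of results.php: take the search term from GET, fetch matching
// headlines, and build an HTML table of links to the detail pages.
$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'classifieds'); // placeholder credentials
$term   = '%' . (isset($_GET['q']) ? $_GET['q'] : '') . '%';

$stmt = $mysqli->prepare('SELECT ad_id, headline FROM ads WHERE headline LIKE ?'); // assumed table/columns
$stmt->bind_param('s', $term);
$stmt->execute();
$result = $stmt->get_result();

$table = '<table>';
while ($row = $result->fetch_assoc()) {
    // Link each headline to its rewritten detail URL (/ads/<ad_id>)
    $table .= '<tr><td><a href="/ads/' . htmlspecialchars($row['ad_id']) . '">'
            . htmlspecialchars($row['headline']) . '</a></td></tr>';
}
$table .= '</table>';
echo $table; // output inside the page's <body>
?>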
The classifieds shown in that table are just "headlines"; clicking one to view all of its details opens another PHP page, ad.php.
There I take the ad_id, fetch all the details from MySQL, and show them to the user.
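A similar sketch for ad.php, assuming the ad_id arrives as a GET parameter (again, the table and column names are made up):
<?php
// Sketch of ad.php: look up a single classified by its id and print the details.
$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'classifieds'); // placeholder credentials
$adId   = isset($_GET['ad_id']) ? $_GET['ad_id'] : '';

$stmt = $mysqli->prepare('SELECT headline, description, price FROM ads WHERE ad_id = ?'); // assumed columns
$stmt->bind_param('s', $adId);
$stmt->execute();
$ad = $stmt->get_result()->fetch_assoc();

if ($ad) {
    echo '<h1>' . htmlspecialchars($ad['headline']) . '</h1>';
    echo '<p>' . htmlspecialchars($ad['description']) . '</p>';
    echo '<p>Price: ' . htmlspecialchars($ad['price']) . '</p>';
} else {
    http_response_code(404); // unknown ad
}
?>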
Now, for anyone who doesn't like filling out forms (Google's crawlers, for example, but also some users), I also list all categories at the bottom of index.html. Clicking one of these opens results.php, showing all results in that category.
Of course I also have a sitemap, which lists all the classifieds and is generated dynamically.
My problem is that, so far, none of my classifieds have been indexed.
My question is, is a sitemap.xml with all classifieds in it, as well as a link to the results.php page for each category enough for Search Engines to index the classifieds?
What else do I need to do?
FYI: I use mod_rewrite to rewrite URLs, so I have a rule which turns the original link to a classified's details:
www.domain.com/ad.php?ad_id=bmw_m3_249244
INTO:
www.domain.com/ads/bmw_m3_249244
And it is the rewritten URL I have in my sitemap.
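For reference, the kind of .htaccess rule that produces such a mapping might look like this (a sketch, not the site's actual rule; it assumes Apache with mod_rewrite enabled):
RewriteEngine On
# Rewrite /ads/bmw_m3_249244 internally to /ad.php?ad_id=bmw_m3_249244
RewriteRule ^ads/([A-Za-z0-9_]+)$ /ad.php?ad_id=$1 [L,QSA]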
Thanks

Have you submitted your sitemap.xml to the search engines, or specified it in your robots.txt? The link below explains:
http://en.wikipedia.org/wiki/Sitemaps
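For example, robots.txt can point crawlers at the sitemap with a single line like this (the domain is a placeholder):
Sitemap: http://www.domain.com/sitemap.xml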

From what you have described, Google shouldn't have any problem indexing your pages, so there may be other things involved. For instance, if the links to the category pages and classified pages are generated with JavaScript, Google will not index them. How long have your pages been online? Does the main page show up in a Google search?

Related

For Google crawling purposes: Single PHP pull-page, or individual pages for each different item?

I am creating a site and want to have individual pages for each row in a database table. The information on each page is fairly useful and comprehensive, and it would be really nice if Google could index them.
My initial thought was to just create a single PHP template page and pull the correct information for whatever the user is looking at, but my fear is that search engines won't be able to index all of the pages.
My second thought was to batch-create/automate the process of creating the individual pages as html files (for the 2000+ rows in the table), because then I would be guaranteed that they'd be crawled. However, if I ever needed to make a change to the design, I'd have to re-process them all. Kind of a pain...
My final consideration was to just pick a page on my site and list all of the possible PHP pages in a hidden div, but I wasn't sure if search engines can index from that. I assume they just pull from the HTML, so they'd be able to find them, right?
Any suggestions? I would love it if I could just create a single page that populates based on what the user clicks, but I want the pages to be indexed.
Search engines can index dynamic pages so using one PHP file to create thousands of unique product pages will be fine for SEO. After all, each page/product will have a unique URL and will be seen as a unique page as a result. All you need to do is link to your product pages within your website and/or submit an XML sitemap so you can be sure they are found and indexed.
By linking your pages, I literally mean link to your product pages. Search engines find new content primarily by following links, so if you want your product pages to be found you need to link to them. Form-based search is not a good way to do it, as search engines generally don't play too well with forms. But there are lots of ways to make links to your pages, including HTML sitemaps and product category pages which then link to the products in that category. Really, any way you can get a link to your product pages is a good way to help ensure they are found by the search engines.
You don't have to post links in an invisible DIV!
Just create the page and fetch its content based on a parameter.
You can include the pages in an XML sitemap and submit it to Google, or include your page URLs in an HTML sitemap too.
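As a rough sketch, an XML sitemap for the product pages could be generated from the database with something like this (the products table, its columns, and the URL scheme are assumptions):
<?php
// Sketch of sitemap.php: emit one <url> entry per product row.
header('Content-Type: application/xml; charset=utf-8');
$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'shop'); // placeholder credentials
$result = $mysqli->query('SELECT id, slug FROM products');       // assumed table/columns

echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
while ($row = $result->fetch_assoc()) {
    // Each product gets its own crawlable URL.
    echo '  <url><loc>http://www.mydomain.com/product/' . (int)$row['id'] . '/'
       . rawurlencode($row['slug']) . "</loc></url>\n";
}
echo '</urlset>';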

How to make search engines index search results on my website?

I have a classifieds website.
It has an index.html, which consists of a form. This form is the one users use to search for classifieds. The results of the search are displayed in an iframe in index.html, so the page won't reload or anything. However, the action of the form is a PHP page, which does the work of fetching the classifieds etc.
Very simple.
My problem is that Google hasn't indexed any of the search results yet.
Must the links be on index.html itself for Google to index the search results? (They are currently displayed in an iframe.)
Or is it because the content is dynamic?
I have a working sitemap with the URLs of all the classifieds in it, but they are still not indexed.
I also have this robots.txt:
Disallow: /bincgi/
The PHP code is inside the /bincgi/ folder; could this be the reason why it isn't being indexed?
I have used URL rewriting to rewrite the URLs of the classifieds to
/annons/classified_title_here
and that is how the sitemap is made up, using the rewritten URLs.
Any ideas why this isn't working?
Thanks
If you need more input let me know.
If the content is entirely dynamic and there is no other way to get to that content except by submitting the form, then Google is likely not indexing the results because of that. Like I mentioned in a comment elsewhere, Google did some experimental form submission on large sites in 2008, but I really have no idea if they expanded on that.
However, if you have a valid and accessible Google Sitemap, Google should index your classifieds fine. I suggest using Google Webmaster Tools to find out how Google treats your site and to diagnose any potential problems with crawling.
eBay is probably a bad example, as it's not impossible that Google uses custom rules for such a popular site.
It is worth noting, though, that eBay has text links to categories and subcategories of auction types, so it is possible to find auction items without actually filling in a form.
Personally, I'd get rid of the iframe; it's not unreasonable, when submitting a form, to load a new page.
That question is not answerable with the information given; too many details are still open. It would help if you posted your site domain and the URLs that you want to get indexed.
Depending on how you use GWT, it can produce unindexable content.
Switch every parameter to GET,
make HTML links to those search queries on pages already known to Googlebot,
and they'll be indexed.
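As a rough illustration of that advice (the file and parameter names are made up):
<!-- Search form using GET, so every search produces a plain, crawlable URL -->
<form action="results.php" method="get">
  <input type="text" name="q">
  <input type="submit" value="Search">
</form>
<!-- Plain HTML links to common searches/categories, reachable without the form -->
<a href="results.php?q=bmw+m3">BMW M3</a>
<a href="results.php?category=cars">Cars</a>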

I have an iframe which content I need Google to index. Is this possible?

I have a classifieds website.
The index.html has a form:
<form action="php_page" target="iframe" etc...>
The iframe displays the results, and the php_page builds the results for the iframe. Basically the php_page builds a table containing the results from a mysql db, and outputs it.
My problem is that this doesn't get indexed by google.
How can I solve this?
The reason I used an Iframe in the first place was to avoid page-reloading when hitting submit.
Ajax couldn't be used due to various reasons I won't go into here.
Any ideas what to do?
Thanks
UPDATE:
I also have a sitemap with URLs to all the classifieds, but I don't think this guarantees that Google will spider those URLs.
Trying to make the google spider crawl the results of a search form is not really the right approach.
Assuming you want google.com users to find your classified ads by searching Google, the best approach is to create a set of static HTML pages from the ads and link to them (not invisibly) from elsewhere on your site (probably best from the home page, but such a link can be in a footer or something else unobtrusive).
They can also be linked to from your sitemap XML (you do have a sitemap XML file, don't you?).
Note: the <iframe> doesn't really come into this. Or Ajax.
There is no way to make any webspider fill out and submit forms.
Workaround: Every night, create a dump of the database and save the HTML to a file. Create a link from index.html to that file. Use CSS classes to make the link invisible. This way, Google will pick it up but users won't see it.
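A rough sketch of such a nightly dump, run from cron (the table, column, and file names are assumptions):
<?php
// Sketch of dump.php: write one static HTML file containing a link to every ad.
$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'classifieds'); // placeholder credentials
$result = $mysqli->query('SELECT ad_id, headline FROM ads');             // assumed table/columns

$html = "<html><body><ul>\n";
while ($row = $result->fetch_assoc()) {
    $html .= '<li><a href="/annons/' . rawurlencode($row['ad_id']) . '">'
           . htmlspecialchars($row['headline']) . "</a></li>\n";
}
$html .= "</ul></body></html>\n";

file_put_contents(__DIR__ . '/all-ads.html', $html); // link to this file from index.html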

how can google find me if I am inside a mysql table?

I am creating a classifieds website.
I'm storing all the ads in a MySQL database, in different tables.
Is it possible to find these ads somehow from Google's search engine?
Is it possible to create meta information about each ad so that Google finds them?
How do major companies do this?
I have thought about auto-generating an HTML page for each ad inserted, but 500 thousand auto-generated HTML pages doesn't really sound like a good solution!
Any thoughts and ideas?
UPDATE:
Here is my basic website so far:
(ALL PHP BASED)
I have a search engine which searches the database for records.
After finding and displaying search results, you can click on a result ('ad') and then PHP fetches info from the database and displays it, simple!
In the 'put ad' section of my site, you can put your own ad into a mysql database.
I need to know how I should make Google find the ads on my website too, as I don't think the Google crawler can search my database just because users can.
Please explain your answers more thoroughly so that I understand fully how this works!
Thank you
Google doesn't find database records. Google finds web pages. If you want your classifieds to be found then they'll need to be on a Web page of some kind. You can help this process by giving Google a site map/index of all your classifieds.
I suggest you take a look at Google Basics and Creating and submitting Sitemaps. Basically the idea is to spoon-feed Google every URL you want Google to find. So if you reference your classifieds this way:
http://www.mysite.com/classified?id=1234
then you create a list of every URL required to find every classified, and yes, this might be hundreds of thousands or even millions of URLs.
The above assumes a single classified per page. You can of course put 5, 10, 50 or 100 on a single page and then create a smaller set of URLs for Google to crawl.
Whatever you do however remember this: your sitemap should reflect how your site is used. Every URL Google finds (or you give it) will appear in the index. So don't give Google a URL that a user couldn't reach by using the site normally or that you don't want a user to use.
So while 50 classifieds per page might mean less requests from Google, if that's not how you want users to use your site (or a view you want to provide) then you'll have to do it some other way.
Just remember: Google indexes Web pages not data.
How would you normally access these classifieds? You're not just keeping them locked up in the database, are you?
Google sees your website like any other visitor would see your website. If you have a normal database-driven site, there's some unique URL for each classified where it is displayed. If there's a link to it somewhere, Google will find it.
If you want Google to index your site, you need to put all your pages on the web and link between them.
You do not have to auto-generate a static HTML page for everything, all pages can be dynamically created (JSP, ASP, PHP, what have you), but they need to be accessible for a web crawler.
Google can find you no matter where you try to hide. Even if you can somehow fit yourself into a mysql table. Because they're Google. :-D
Seriously, though, they use a bot to periodically spider your site, so you mostly just need to make the data in your database available as web pages on your site and make your site bot-friendly (use an appropriate robots.txt file, provide a search-engine-friendly site map, etc.). You also need to make sure they can find your site, so make sure it's linked to by other sites, preferably sites with lots of traffic.
If your site only displays specific results in response to search terms you'll have a harder time. You may want to make full lists of the records available for people without search terms (paged appropriately if you have lots of data).
First, create a PHP file that pulls the index plus a human-readable reference for all records.
That is your main page, broken out into categories (in the case of Craigslist.com, by country and state).
Each category link then feeds the selected value back to the PHP script, whatever the level, until it finally reaches the ad itself.
So, if the selected category contains more categories (as states contain cities), display the next list of categories; otherwise display the list of ads for that city.
This gives Google a way to index a site (i.e. a MySQL database) dynamically, without creating static content for the millions (billions or trillions) of records involved.
This is just one idea of how to get Google to index a database.
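A rough sketch of that drill-down (the schema here, categories(id, parent_id, name) and ads(id, category_id, headline), is an assumption):
<?php
// Sketch of browse.php: show subcategories of the selected category,
// or, if there are none, list the ads in that category.
$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'classifieds'); // placeholder credentials
$catId  = isset($_GET['cat']) ? (int)$_GET['cat'] : 0;

$stmt = $mysqli->prepare('SELECT id, name FROM categories WHERE parent_id = ?');
$stmt->bind_param('i', $catId);
$stmt->execute();
$subcats = $stmt->get_result();

if ($subcats->num_rows > 0) {
    // More levels below: list the subcategories as plain links.
    while ($row = $subcats->fetch_assoc()) {
        echo '<a href="browse.php?cat=' . (int)$row['id'] . '">' . htmlspecialchars($row['name']) . "</a><br>\n";
    }
} else {
    // Leaf category: list the ads themselves.
    $stmt = $mysqli->prepare('SELECT id, headline FROM ads WHERE category_id = ?');
    $stmt->bind_param('i', $catId);
    $stmt->execute();
    $ads = $stmt->get_result();
    while ($row = $ads->fetch_assoc()) {
        echo '<a href="ad.php?id=' . (int)$row['id'] . '">' . htmlspecialchars($row['headline']) . "</a><br>\n";
    }
}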

How to Have Search Engines Index Database-Driven Content?

How can I make it so that content from a database is available to search engines, like google, for indexing?
Example:
Table in mysql has a field named 'Headline' which equals 'BMW M3 2005'.
My site name is 'MySite'
If a user enters 'BMW M3 2005 MySite' in Google, will the record show up in the results?
Google indexes web pages, so you will need to have a page for each of your records. This doesn't mean you need to create 1,000 HTML pages; following the approach below will dynamically and easily provide a seemingly unique page for each product.
For example:
www.mydomain.com/buy/123/nice-bmw-m3-2005
You can use .htaccess to change this link to:
www.mydomain.com/product.php?id=123
Within this script you can dynamically create each page with up-to-date information by querying your database based on the product id in this case 123.
Your script will provide each record with its own title ('Nice BMW M3 2005') and a nice friendly URL ('www.mydomain.com/buy/123/nice-bmw-m3-2005'), and you can include the correct meta information too, as well as images, reviews, etc.
Job done, and you don't have to create hundreds of static HTML pages.
For more information on .htaccess, check out this tutorial.
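To make that concrete, a sketch of what product.php might look like behind that rewrite (the table and column names are illustrative, not prescriptive):
<?php
// Sketch of product.php: /buy/123/nice-bmw-m3-2005 is assumed to arrive here
// as product.php?id=123 via the .htaccess rule described above.
$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'shop'); // placeholder credentials
$id     = isset($_GET['id']) ? (int)$_GET['id'] : 0;

$stmt = $mysqli->prepare('SELECT title, description FROM products WHERE id = ?'); // assumed table/columns
$stmt->bind_param('i', $id);
$stmt->execute();
$product = $stmt->get_result()->fetch_assoc();

if (!$product) {
    http_response_code(404);
    exit('Not found');
}
?>
<html>
<head>
  <title><?php echo htmlspecialchars($product['title']); ?></title>
  <meta name="description" content="<?php echo htmlspecialchars($product['description']); ?>">
</head>
<body>
  <h1><?php echo htmlspecialchars($product['title']); ?></h1>
  <p><?php echo htmlspecialchars($product['description']); ?></p>
</body>
</html>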
What ILMV is trying to explain is that you have to have HTML pages that Google and other search engines can 'crawl' in order for them to index your content.
Since your information is loading dynamically from a database, you will need to use a server-side language like PHP to dynamically load information from the database and then output that information to an HTML page.
You have any number of options for how to accomplish this specifically, ILMV's suggestion is one of the better ways though.
Basically what you need to do first is figure out how to pull the information from the database and then use PHP (or another server-side language) to output the information to an HTML page.
Then you will need to determine whether you want to use the uglier default URL style for PHP-driven pages:
mysite.com/products.php?id=123
But this URL is not very user- or search-engine-friendly and will result in your content not being indexed very well.
Or you can use some sort of URL rewriting mechanism (mod_rewrite in a .htaccess file does this, or you can look at more complex PHP-oriented solutions like Zend Framework, which provides what's called a front controller to handle the mapping of all requests) to make your URLs look like:
mysite.com/products/123/nice-bmw-m3-2006
This is what ILMV is talking about with regard to URL masking.
Using this method of dynamically loading content will allow you to develop a single page that loads the information for any number of different products based on the ID, thus making it seem to the various search engines as though you have a unique page for each product.
