I have created a website that displays teacher profiles. Every profile page is generated dynamically from a database. On my home page I list every teacher's profile from the database with a link to the complete profile, e.g. index.php?teacherId=21. Now I want to convert my dynamic pages into basic static HTML pages to get some benefits for my website. Number one is a higher search engine ranking for my profile pages; I also need to name each HTML page after its teacher. My website also has a search option that helps users find teacher profiles by subject, grade, town, city, etc. So, is it possible to keep such a search system on my website after I have turned my PHP pages into basic HTML pages?
Any ideas are greatly appreciated.
What you're looking for is a friendly URL: SEO Friendly URL
You make URLs like:
/teachers/21/name-of-teacher
Which then internally maps to:
/profile/teachers/index.php?id=21
The ID at the beginning is what you use to load the record from MySQL; just like how SO does it, really :)
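A minimal sketch of that mapping with Apache's mod_rewrite, assuming the rules live in an .htaccess file at the web root (the paths follow the example above):

RewriteEngine On
# /teachers/21/name-of-teacher -> /profile/teachers/index.php?id=21
# Only the numeric ID is used for the lookup; the name part is cosmetic.
RewriteRule ^teachers/([0-9]+)/[^/]+/?$ /profile/teachers/index.php?id=$1 [L,QSA]

On the PHP side, index.php then just reads $_GET['id'] and casts it to an integer before querying MySQL.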
Investigate MVC frameworks for PHP. These will give you your search-engine-friendly URLs.
Two that spring to mind are:
Zend
Cake
There are many more; see this list for starters!
First post here.
I am making a site for someone who wants to embed their own YouTube links. I am relatively new to web development and this is my first commission, so I was wondering how I could hand the site over to the client and let them upload their links without me having to edit the markup for them.
I understand how to use an <iframe> YouTube embed in HTML, and I am considering creating a database (managed through phpMyAdmin) with a form the client can fill in with the iframe link whenever he needs to embed a new video.
I understand if that sounds like a convoluted way of doing this but if anyone knows a more intuitive way to solve this issue please let me know :)
I think it can be done with a database (MySQL, to keep it simple) and a simple backend language; PHP would do.
Create an admin page just to collect the URLs from the admin: a nice little login page that stores the admin username and password in a table. You can also use this module to assign roles, add an admin, or delete one.
Inside the module created above, create a page with a form to collect and store URLs in the database. If you are going to categorize the videos, add suitable columns. Say you categorize by genre and length: the table in your database should then contain genre and length columns so you can use them as filters later on (see the sketch after these steps).
Use cookies to create a session for the admin, to avoid redirecting him to the login page after every single reload.
For any user who is not an admin, do not redirect to the admin page. Redirect them to a public page where they can search and enjoy the videos the admin has added.
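A rough sketch of the URL-collecting page from the steps above, assuming PDO and a made-up videos table with url, genre and length columns (every name here is illustrative, not prescriptive):

<?php
// save_video.php - receives the admin form POST and stores the embed URL.
session_start();
if (empty($_SESSION['is_admin'])) {          // only a logged-in admin may post
    header('Location: login.php');
    exit;
}

$pdo  = new PDO('mysql:host=localhost;dbname=videosite', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO videos (url, genre, length) VALUES (?, ?, ?)');
$stmt->execute([$_POST['url'], $_POST['genre'], (int) $_POST['length']]);

header('Location: admin.php?saved=1');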
Again, as mentioned in many comments above, WordPress does all of this. You have the option to use it too; it's free.
I am currently working on an eCommerce-style project that uses a search engine to browse 7,000+ entries stored in a database. Every one of these search results contains a link to a full description page. I have been looking into creating clean/slug URLs for this; my goal is that when a user clicks on a search result, the browser navigates to a new page using the slug URL.
www.mydomain.com/category/brown-fox-statue-23432323
I have a system in place to convert a string or sentence into URL form. However, it is not clear to me what the next steps are once these URLs are created. What is the general plan for implementing this system? Do the URLs need to be stored in a database? Am I supposed to use POST or GET data from the search result page to create the content at these full description URLs?
I appreciate any suggestions!
Many thanks in advance!
Each product has a unique URL associated with it in the database.
When you perform a search, you just return the correct unique URL.
That way you only ever work out what the URL should be once, when the product is first added, and that URL will always relate to that one product. This is the stage at which you use your system to create the URL.
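For instance, a slug could be generated once at insert time with a helper along these lines (the function is a sketch; the exact rules are up to you):

<?php
// Run once when the product is first added; the result is stored with the row.
function slugify(string $title): string {
    $slug = strtolower(trim($title));
    $slug = preg_replace('/[^a-z0-9]+/', '-', $slug);   // non-alphanumerics -> "-"
    return trim($slug, '-');
}

echo slugify('Brown Fox Statue');   // brown-fox-statue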
Maybe you can enlighten us as to whether you are using a framework? Some frameworks (like Zend) have INI/XML files for routing. But you will still need to store the URLs, or at least the article slugs, in a database.
Storing the entry URLs in the database once they have been generated is necessary because you want slugs to stay the same for each entry. This allows for better caching and SEO, which will improve your site's usability.
Hope that helps!
Edit: I saw your question about pulling up individual articles. You will have to start by setting up a relation between your entries and URLs in the database. Create a url table with url_id and url, then place url_id on the entry table. Whenever someone visits a URL, search the url table for the current URL, recall the url_id, and pull the matching entry. At that point it's just styling the page to make it look the way you want.
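A sketch of that lookup, assuming the two tables described above and PDO:

<?php
// Resolve the requested path via the url table, then pull the matching entry.
$path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);

$pdo  = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
$stmt = $pdo->prepare(
    'SELECT e.* FROM urls u JOIN entries e ON e.url_id = u.url_id WHERE u.url = ?'
);
$stmt->execute([$path]);
$entry = $stmt->fetch(PDO::FETCH_ASSOC);   // false means unknown URL -> serve a 404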
A common approach is to have a bijective (reversible) function that can convert a "regular" URL into a user-friendly URL:
E.g.:
www.mydomain.com/category/brown-fox-statue-23432323
<=>
www.mydomain.com/index.php?category=brown-fox-statue-23432323
Then you need not keep a record of this mapping (convention over configuration).
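Because the numeric ID travels inside the slug, the reverse direction is pure string parsing; a sketch:

<?php
// "brown-fox-statue-23432323" -> 23432323 (the words before the ID are cosmetic).
function idFromSlug(string $slug): ?int {
    return preg_match('/-(\d+)$/', $slug, $m) ? (int) $m[1] : null;
}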
Search StackOverflow for "User Friendly URL Rewriting" for information on how to achieve this automatically with Apache. This question is a good starting point.
I have my website built on PHP & MySQL; think of it as something like a forum. When a user posts a reply on my website 1 (e.g. www.website1.com), I want to be able to show the starting thread and its related replies on a sister website of mine, in a way that does not show the rest of the page and the other page contents (like the logo etc.). I don't think an iframe is the solution, because an iframe would embed the whole page, and users visiting my sister website (a totally different domain, i.e. www.website2.com) would see all the page contents, like the logo. I want to avoid that and let them see only limited information from website 1, only the info that I intend.
I hope that makes sense. In a way, you could say that I am trying to replicate my first website and show only a limited part of it. Users browsing the second website can post a reply there, and it should automatically be posted and visible to visitors of website 1. Users of website 1 should not know that a user of website 2 posted it; it should look as though some user of website 1 posted it. Do I have to use two separate MySQL databases or just one? I think using different databases would be problematic, and I am also worried about connectivity issues, since I assume I can connect to only one database at a time.
Basically, users of website1.com should feel that they are replying to users of website1.com, and users of website2.com should feel that they are replying to users of website2.com (I need it this way to bridge the gap between them). At the same time, I want the front ends of the two websites to look different, so that users don't feel they are replying to users outside their domain. Both websites are under my control and I have access to the source code at all times; if the source needs changing, such changes are welcome.
Is this really possible?
Thank you in advance.
I'd recommend generating RSS (it can be generated at runtime) and using it on the sister website. If RSS is not suitable for your needs, you can create your own XML-based format (or any other :) )
Make two forums which use one database. Both websites would put new messages in the same database.
Make an API for website1 so that website2 can retrieve and post messages on the forum. Website2 would make an HTTP request to website1, which returns XML or JSON, so that website2 can request a list of posts to display.
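A bare-bones sketch of the read side of such an API (the script name, table and columns are all invented for illustration):

<?php
// api/posts.php on website1 - returns a thread's posts as JSON for website2.
$pdo  = new PDO('mysql:host=localhost;dbname=forum', 'user', 'pass');
$stmt = $pdo->prepare(
    'SELECT id, author, body, created_at FROM posts WHERE thread_id = ? ORDER BY created_at'
);
$stmt->execute([(int) ($_GET['thread'] ?? 0)]);

header('Content-Type: application/json');
echo json_encode($stmt->fetchAll(PDO::FETCH_ASSOC));

website2 can then fetch this with cURL or file_get_contents() and render the posts in its own markup.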
Have both sites connect to the same database and display the content they pull in whatever way is appropriate for the particular site. Each site can only pull the fields relevant to that site.
If the idea is to have two websites with the same data but different presentations, then you would want to simply share a single database between them - assuming they are hosted in the same place and can both get at the database.
You can then just create different PHP pages that both access the same database in the same way but display the data differently.
The best way to do this would be to have a shared library of functions or classes that both sites use to manipulate the data. You would then build a different "presentation layer" on top of that for each site.
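As an illustration of that layering, both sites might include one shared data-access file and differ only in their templates (all names here are hypothetical):

<?php
// shared/forum_data.php - included by website1 and website2 alike.
function forum_db(): PDO {
    return new PDO('mysql:host=localhost;dbname=forum', 'user', 'pass');
}

function thread_posts(int $threadId): array {
    $stmt = forum_db()->prepare(
        'SELECT author, body, created_at FROM posts WHERE thread_id = ? ORDER BY created_at'
    );
    $stmt->execute([$threadId]);
    return $stmt->fetchAll(PDO::FETCH_ASSOC);
}
// Each site wraps thread_posts() in its own header, footer and styling.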
I am creating a classifieds website.
I'm storing all the ads in a MySQL database, in different tables.
Is it possible for these ads to be found through Google's search engine somehow?
Is it possible to create meta information about each ad so that Google finds them?
How do major companies do this?
I have thought about auto-generating an HTML page for each ad inserted, but 500,000 auto-generated HTML pages doesn't really sound like a good solution!
Any thoughts and ideas?
UPDATE:
Here is my basic website so far:
(ALL PHP BASED)
I have a search engine which searches the database for records.
After finding and displaying the search results, you can click on a result (an 'ad'), and PHP then fetches the info from the database and displays it. Simple!
In the 'put ad' section of my site, you can put your own ad into the MySQL database.
I need to know how to make Google find the ads on my website as well, since I don't think the Google crawler can search my database just because users can.
Please explain your answers more thoroughly so that I understand fully how this works!
Thank you
Google doesn't find database records. Google finds web pages. If you want your classifieds to be found then they'll need to be on a Web page of some kind. You can help this process by giving Google a site map/index of all your classifieds.
I suggest you take a look at Google Basics and Creating and submitting Sitemaps. Basically the idea is to spoon-feed Google every URL you want Google to find. So if you reference your classifieds this way:
http://www.mysite.com/classified?id=1234
then you create a list of every URL required to find every classified, and yes, this might be hundreds of thousands or even millions.
The above assumes a single classified per page. You can of course put 5, 10, 50 or 100 on a single page and then create a smaller set of URLs for Google to crawl.
Whatever you do however remember this: your sitemap should reflect how your site is used. Every URL Google finds (or you give it) will appear in the index. So don't give Google a URL that a user couldn't reach by using the site normally or that you don't want a user to use.
So while 50 classifieds per page might mean less requests from Google, if that's not how you want users to use your site (or a view you want to provide) then you'll have to do it some other way.
Just remember: Google indexes Web pages, not data.
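A sketch of the spoon-feeding itself: a script that emits a sitemap straight from the database (the table and URL shape are assumptions; note that the sitemap protocol caps a single file at 50,000 URLs, so a site this size would split the output into several files):

<?php
// sitemap.php - one <url> entry per classified, generated on the fly.
header('Content-Type: application/xml');

$pdo = new PDO('mysql:host=localhost;dbname=classifieds', 'user', 'pass');

echo '<?xml version="1.0" encoding="UTF-8"?>';
echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">';
foreach ($pdo->query('SELECT id FROM ads') as $row) {
    echo '<url><loc>http://www.mysite.com/classified?id=' . (int) $row['id'] . '</loc></url>';
}
echo '</urlset>';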
How would you normally access these classifieds? You're not just keeping them locked up in the database, are you?
Google sees your website like any other visitor would see it. If you have a normal database-driven site, there's some unique URL for each classified where it is displayed. If there's a link to it somewhere, Google will find it.
If you want Google to index your site, you need to put all your pages on the web and link between them.
You do not have to auto-generate a static HTML page for everything, all pages can be dynamically created (JSP, ASP, PHP, what have you), but they need to be accessible for a web crawler.
Google can find you no matter where you try to hide. Even if you can somehow fit yourself into a mysql table. Because they're Google. :-D
Seriously, though, they use a bot to periodically spider your site, so you mostly just need to make the data in your database available as web pages on your site, and make your site bot-friendly (use an appropriate robots.txt file, provide a search-engine-friendly site map, etc.). You also need to make sure they can find your site in the first place, so make sure it's linked to by other sites, preferably sites with lots of traffic.
If your site only displays specific results in response to search terms you'll have a harder time. You may want to make full lists of the records available for people without search terms (paged appropriately if you have lots of data).
First, create a PHP file that pulls the index plus a human-readable reference for all records.
That is your main page, broken out into categories (as on Craigslist.com, by country and state).
Each category link then feeds the selected value back to the PHP script, however many levels deep, until it finally reaches the ad itself.
So, if the selected category contains more categories (as states contain cities), display the next list of categories; otherwise display the list of ads for that city (sketched below).
This gives Google a way to index a site (i.e. a MySQL database) dynamically, without creating static content for the millions (billions or trillions?) of records involved.
This is just an idea of how to get Google to index a database.
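A compressed sketch of that drill-down, assuming hypothetical categories(id, parent_id, name) and ads(id, category_id, title) tables:

<?php
// browse.php - lists child categories while they exist, then the ads themselves.
$pdo = new PDO('mysql:host=localhost;dbname=classifieds', 'user', 'pass');
$cat = (int) ($_GET['cat'] ?? 0);

$stmt = $pdo->prepare('SELECT id, name FROM categories WHERE parent_id = ?');
$stmt->execute([$cat]);
$children = $stmt->fetchAll(PDO::FETCH_ASSOC);

if ($children) {                                  // still inside the hierarchy
    foreach ($children as $c) {
        echo '<a href="browse.php?cat=' . $c['id'] . '">'
           . htmlspecialchars($c['name']) . '</a><br>';
    }
} else {                                          // leaf level: list the ads
    $stmt = $pdo->prepare('SELECT id, title FROM ads WHERE category_id = ?');
    $stmt->execute([$cat]);
    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $ad) {
        echo '<a href="ad.php?id=' . $ad['id'] . '">'
           . htmlspecialchars($ad['title']) . '</a><br>';
    }
}

Since every category and every ad is reachable through a plain link, a crawler can walk the whole database without a single static page.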
How can I make it so that content from a database is available to search engines, like Google, for indexing?
Example:
A table in MySQL has a field named 'Headline' whose value is 'BMW M3 2005'.
My site's name is 'MySite'.
If a user enters 'BMW M3 2005 MySite' in Google, will the record show up in the results?
Google indexes web pages, so you will need a page for each of your records. This doesn't mean you need to create 1,000 HTML pages; the approach below will dynamically and easily provide a seemingly unique page for each product.
For example:
www.mydomain.com/buy/123/nice-bmw-m3-2005
You can use .htaccess to map this link to:
www.mydomain.com/product.php?id=123
Within this script you can dynamically create each page with up-to-date information by querying your database based on the product ID, in this case 123.
Your script will give each record its own title ('Nice BMW M3 2005'), a nice friendly URL ('www.mydomain.com/buy/123/nice-bmw-m3-2005'), and you can include the correct meta information too, as well as images, reviews, etc.
Job done, and you don't have to create hundreds of static HTML pages.
For more information on .htaccess, check out this tutorial.
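As a sketch of what product.php might do once the rewrite delivers the ID (the table and column names are invented):

<?php
// product.php - one script renders every product page from the database.
$pdo  = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');
$stmt = $pdo->prepare('SELECT title, description FROM products WHERE id = ?');
$stmt->execute([(int) ($_GET['id'] ?? 0)]);
$p = $stmt->fetch(PDO::FETCH_ASSOC) ?: ['title' => 'Not found', 'description' => ''];

// Unique, crawlable title and meta description for each record:
echo '<title>' . htmlspecialchars($p['title']) . '</title>';
echo '<meta name="description" content="' . htmlspecialchars($p['description']) . '">';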
What ILMV is trying to explain is that you have to have HTML pages that Google and other search engines can 'crawl' in order for them to index your content.
Since your information is loading dynamically from a database, you will need to use a server-side language like PHP to dynamically load information from the database and then output that information to an HTML page.
You have any number of options for how to accomplish this specifically; ILMV's suggestion is one of the better ways, though.
Basically what you need to do first is figure out how to pull the information from the database and then use PHP (or another server-side language) to output the information to an HTML page.
Then you will need to decide whether you want to use the uglier, default URL style for PHP-driven pages:
mysite.com/products.php?id=123
But this URL is not very user- or search-engine-friendly and will result in your content not being indexed very well.
Alternatively, you can use some sort of URL-rewriting mechanism (mod_rewrite in an .htaccess file does this, or you can look at more complex PHP-oriented solutions like Zend Framework, which provide what's called a front controller to handle the mapping of all requests) to make your URLs look like:
mysite.com/products/123/nice-bmw-m3-2006
This is what ILMV is talking about with regard to URL masking.
Dynamically loading content this way lets you develop a single page that loads the information for any number of different products based on the ID, making it seem to the various search engines as though you have a unique page for each product.
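For a feel of the front-controller approach, here is a toy sketch in plain PHP, assuming the server is already configured to route every request to index.php (all names are invented):

<?php
// index.php - parses /products/123/nice-bmw-m3-2006 and dispatches.
$path  = trim(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH), '/');
$parts = explode('/', $path);            // ["products", "123", "nice-bmw-m3-2006"]

if ($parts[0] === 'products' && isset($parts[1]) && ctype_digit($parts[1])) {
    $_GET['id'] = (int) $parts[1];       // the trailing slug is only for humans
    require 'product.php';               // reuse the single dynamic page script
} else {
    http_response_code(404);
    echo 'Page not found';
}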