I want to build an image website with URLs like this:
ex.com/en-us/lion-wallpapers.html
ex.com/en-gb/lion-wallpapers.html
ex.com/en-in/lion-wallpapers.html
and I want a single image path to be served through multiple URLs
using routing:
images/lion/lion.jpg
Will Google crawl this as a single image even though it appears under multiple URLs?
Or do I need to maintain individual directories and copy the image into each one? Please suggest some best practices and a URL structure for this.
For a URL structure like this you're better off using a framework that supports routing and building your website in a Model-View-Controller (MVC) pattern.
This is fairly abstract architectural territory, so I can't give fully specific examples.
Any framework that supports routing would work, but one example is Zend Framework: http://framework.zend.com/
Here's their documentation on routing: http://framework.zend.com/manual/1.12/en/zend.controller.router.html
With routing you would be able to have one lion-wallpapers.html page that changes the language of the page depending on the URL.
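As a rough sketch of that idea, a Zend Framework 1 regex route could map all three locale URLs onto a single controller; the 'wallpaper' controller and 'show' action names here are assumptions, not part of the framework:

```php
// Map e.g. /en-us/lion-wallpapers.html and /en-gb/lion-wallpapers.html
// onto one controller action; the locale and slug arrive as params.
$router = Zend_Controller_Front::getInstance()->getRouter();
$router->addRoute('wallpapers', new Zend_Controller_Router_Route_Regex(
    '([a-z]{2}-[a-z]{2})/(.+)-wallpapers\.html',
    array('controller' => 'wallpaper', 'action' => 'show'),
    array(1 => 'locale', 2 => 'slug')
));
// In the action, $this->getRequest()->getParam('locale') selects the
// language, while every locale serves the same images/lion/lion.jpg.
```

This way the image exists once on disk and every locale URL references it; there is no need to copy it into per-locale directories.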
In terms of how Google will crawl your site: it will follow links from your first page and anything you add to a sitemap. You can help customise how it crawls your website by adding your site to Google Webmaster Tools here: https://www.google.com/webmasters/tools/home?hl=en
We have a good old website where all the pages are static and we update them by editing the HTML files. Every redesign is a lot of work for us, since we copy and paste all the text from the old design to the new one. If we need to change any single thing about the design, we have to change it on every page, one by one.
Now we have developed a site using CodeIgniter and want to replace those static pages with database-driven pages. We will put all the content into the database, so we will have just one product page and retrieve each product's info according to the query string.
The problem is we don't want to change the old urls with the new Codeigniter urls because of search engine ranking concerns.
URLs in the old website are like these:
example.com/code1001.php
example.com/code1002.php
example.com/very-good-product-1003.php
example.com/brand-new-product.php
example.com/product-listing.php
The Codeigniter URLS in the new website are like these:
example.com/products/details/code1001
example.com/products/details/code1002
example.com/products/details/code1003
example.com/products/listing
After some quick research, I figured I could use CI's routes.php to display the database-driven pages when a user or Google hits an old URL.
I can use the below code to do that.
$route['code1001.php'] = "products/details/code1001";
$route['very-good-product-1003.php'] = "products/details/code1003";
$route['brand-new-product.php'] = "products/details/code1004";
$route['product-listing.php'] = "products/listing";
I don't want this to be a 301 or 302 redirect; I just want to replace the static pages with the database-driven pages and leave the URLs the same.
Do you think that this will have a negative effect on SEO? Is Google able to notice this routing?
The best way to do dynamic routing is:
$route['confirm_registration/(:any)']= "login/confirm_registration/$1";
For redirects you can use .htaccess.
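You said you don't want redirects, but for completeness, if you ever did want the old URLs to 301 to the new ones, a minimal .htaccess sketch (assuming Apache with mod_rewrite enabled) would be:

```apache
RewriteEngine On
# Permanently redirect an old static page to its new CodeIgniter URL.
RewriteRule ^code1001\.php$ /products/details/code1001 [R=301,L]
RewriteRule ^product-listing\.php$ /products/listing [R=301,L]
```

With the routes.php approach instead, the old URLs keep serving content directly and no redirect status is sent.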
Manually you can make PHP pages in your directory.
e.g.
Index.php
About.php
Contact.php
But with PHP frameworks like Laravel the pages do not exist in a file, they are in the database and are called when the user visits the page.
e.g.
If a person visits http://mywebsite.com/contact , the framework will look in the database for a page named 'contact' and then output it to the user.
But how does Google (or other search engines) find those pages if they only exist in the database?
Google can index these fine, as they are generated server-side. Files do not need to exist on disk for Google to index a page; the URL just needs to return content from the server.
Where Google has issues indexing is if your site is client-side based and uses something like AJAX to pull content into the browser. A search-engine spider can't execute JavaScript, so it never finds the content. However, Google has defined some guidelines for getting this kind of content indexed in its Webmaster Guidelines.
You have a static website address, www.domain.com, and that is real, so once Google learns there is a website named www.domain.com it will visit the site. Once the crawler is on your website, it will look for the links available on the home page of www.domain.com, and those pages will be crawled in turn. That's simple.
In Laravel, pages DO NOT exist in the database, although they might be dynamically generated.
As @expodax pointed out,
Google will index LINKS for your web app, and links (URIs) are generated in accordance with your routes.php file (found in app/Http/routes.php).
In essence, Google will index the links / URIs available to the end user; it DOES NOT depend on how you've organized the files in your web app.
For detailed documentation about routes in Laravel (how they can be generated and used), please check this:
http://laravel.com/docs/5.0/routing
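As a sketch of that idea (assuming Laravel 5.0; PageController and its show method are hypothetical names), a single route can serve every database-backed page:

```php
// app/Http/routes.php
// Any GET request such as /contact is matched here; the controller
// looks up a page record by its slug in the database and renders it.
Route::get('{slug}', 'PageController@show');
```

Google then indexes whatever /contact-style URIs this route exposes; the absence of a contact.php file on disk makes no difference to the crawler.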
A sitemap is a file where you can list the web pages of your site to tell Google and other search engines about the organization of your site content. Search-engine web crawlers like Googlebot read this file to crawl your site more intelligently.
If you want to generate a sitemap for your Laravel application, you can do it manually or you can use a package like this: https://github.com/RoumenDamianoff/laravel-sitemap
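If you write one by hand, a minimal sitemap.xml (the URL here is a placeholder) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://mywebsite.com/contact</loc>
  </url>
</urlset>
```

Each database-backed page gets one `<url>` entry; you then submit the file in Google Webmaster Tools or reference it from robots.txt.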
I'm building a kind of social-network site using PHP and MySQL (and I'm quite new to web development). I want user profiles to be accessible through the URL as "http://www.example.com/username".
It is also very important to me that each profile page will be indexed by search engines.
How do I do this?
You need to use .htaccess (mod_rewrite, to be more specific).
You can find many resources online for this.
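A minimal sketch, assuming Apache with mod_rewrite enabled and a hypothetical profile.php that looks the user up by name:

```apache
RewriteEngine On
# Leave requests for real files and directories alone.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# Internally rewrite /username to profile.php?username=username.
RewriteRule ^([A-Za-z0-9_-]+)/?$ profile.php?username=$1 [L,QSA]
```

Because this is an internal rewrite rather than a redirect, search engines see and index the clean /username URLs, which is what you want for your profile pages.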
I need some advice on where and how I should start this project. The current site runs classic ASP but uses .php extensions in the file system (don't ask; a previous freelancer).
The current URL structure has subfolders containing display pages that render the page:
team/team_detail.php?teamID=3&Source=Title&Title=Partner
Only the teamID part actually does anything; it queries the data in a team table.
I was thinking of using IIS URL Rewrite rules and adding a user-friendly rule for each matching URL pattern. I was planning to change all the website code to use the new structure:
New URL: team/team_detail/3/bob-titans
Old URL format that will still work: team/team_detail.php?teamID=3&name=bob-titans
Would this complete site wide URL change be the most effective way to do this or would there be another more effective method?
If your main objective is to change the URL structure of the site (and clean up stuff like the unused "Source" and "Title" parameters while you're at it), then the approach is sound:
Modify the site code to use the new URL strategy (for example, with ASP.NET Routing).
Use the IIS URL Rewrite Module to create rules to properly redirect or rewrite the old-style requests to the new format. If this is a public site, and you are concerned about SEO, you should consider making these redirects use the 301 response type to tell search engines the resources under the old URLs have moved permanently to the new URLs.
It's a nice approach because you can focus on updating your code for the new structure, and let IIS take care of any users still referencing the old URL formats.
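As a sketch, an IIS URL Rewrite rule in web.config that 301-redirects the old query-string format to the new structure might look like this (the rule name and patterns are assumptions for this example):

```xml
<system.webServer>
  <rewrite>
    <rules>
      <!-- Redirect team/team_detail.php?teamID=3&name=bob-titans
           to team/team_detail/3/bob-titans with a 301. -->
      <rule name="OldTeamDetail" stopProcessing="true">
        <match url="^team/team_detail\.php$" />
        <conditions>
          <add input="{QUERY_STRING}" pattern="teamID=([0-9]+)&amp;name=([^&amp;]+)" />
        </conditions>
        <action type="Redirect" url="team/team_detail/{C:1}/{C:2}"
                redirectType="Permanent" appendQueryString="false" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

Swapping the action type to "Rewrite" instead of "Redirect" would serve the old URLs silently, but for SEO the permanent redirect is the usual choice.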
Is it possible to build a single-page website with SilverStripe? Is it a bad idea from SilverStripe's point of view?
What would a template look like that loads all articles from several categories and subcategories in SilverStripe?
Thanks for any links to tutorials, code snippets, etc.
I have built backbone.js 'apps' with SilverStripe. You load the front end resources as you would any other SS project. I prefer not to use SS template vars to include assets (js/css), as js applications require strict placement guidelines (although it is possible to load everything using the Requirements class).
Check out the RESTful Server docs for examples of using the SilverStripe CMS to control your data, and use API endpoints to access this data from the front end. In fact, there is no need to have the SS back end and the single-page-app front end on the same server in this scenario, which is the beauty of REST.
For link handling in single-page apps, you can use pushState or listen for hash-change events. Either way, the front-end JS handles routing, rather than using SS's Director class.
Please provide more details on what you are trying to do if you'd like more specific help.