Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about programming within the scope defined in the help center.
Closed 5 years ago.
I would like to know how I can make a webpage that will look good on most mobile phones.
For normal webpages, I use Dreamweaver CS3.
I don't really want to use emulators unless I have to.
Can't I just center everything, with auto margins on both sides?
The pages that I have made so far look OK,
but on a mobile phone some things get cluttered.
I am planning to strip out all the non-essential information and markup for the mobile version.
Then there is the question that is basically the most important one for me now:
how do I present a different page when a mobile phone goes to the same address?
edit
I didn't get an answer on the last one, but I found a browser-detection script that I could use.
thanks, Richard
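A browser-detection script of the sort mentioned in the edit can be sketched in PHP: inspect the User-Agent header and redirect phones to a separate page. The substring list and the /mobile/ path below are illustrative only; a maintained detection library will be far more complete.

```php
<?php
// Minimal sketch of server-side user-agent detection.
// The substrings below are illustrative, not an exhaustive list.
function is_mobile_user_agent($ua) {
    $needles = array('Mobile', 'Android', 'iPhone', 'Opera Mini', 'Symbian', 'BlackBerry');
    foreach ($needles as $needle) {
        if (stripos($ua, $needle) !== false) {
            return true;
        }
    }
    return false;
}

// At the top of index.php, before any output is sent:
if (isset($_SERVER['HTTP_USER_AGENT'])
        && is_mobile_user_agent($_SERVER['HTTP_USER_AGENT'])) {
    header('Location: /mobile/index.php');
    exit;
}
```

Desktop visitors fall through to the normal page; anything that matches is sent to the mobile version at the same address.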
To get the greatest amount of compatibility, you have to trade away most of the bells and whistles that browsers running on desktop computers consider run of the mill.
Declare the mobile doctype in your XHTML documents, and make sure your markup and stylesheets are valid.
Keep your CSS really simple: no :hover, don't use images as part of the design, and limit your use of fixed sizes and margins.
Emulators aren't as effective for testing as the actual devices themselves. Phones that do WiFi or Bluetooth PAN can make testing cheaper, but testing over a carrier's network will also give you a better understanding of the speed and latency.
There is no "typical device", but if you can get your website looking really good under WebKit without using WebKit's CSS extensions, you've covered a huge chunk of devices (Nokia S60, iPhone, Android, etc.). Get it working in Opera Mini, and you'll expand that chunk even more.
Compact your output as much as possible. Not only are your end users limited in screen size and CPU, they are most likely limited by the network as well. The faster you can push your data to them, the less sluggish your website will feel.
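To make "compact your output" concrete, here is a minimal PHP sketch: strip inter-tag whitespace and gzip the response. The whitespace regex is crude and assumes markup where whitespace between tags is insignificant (no <pre> blocks).

```php
<?php
// Collapse runs of whitespace between tags; crude but effective
// for static markup without <pre> or whitespace-sensitive content.
function compact_html($html) {
    return preg_replace('/>\s+</', '><', trim($html));
}

// Before any output, enable gzip when the client supports it:
if (!ob_start('ob_gzhandler')) {
    ob_start(); // fall back to plain buffering
}

echo compact_html('<ul>
    <li>Home</li>
    <li>News</li>
</ul>');
```

Both steps shrink the bytes on the wire, which matters far more on a slow carrier network than on broadband.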
A List Apart has a great article about conditionally using different style sheets for mobile devices:
http://www.alistapart.com/articles/putyourcontentinmypocket/
You can also check out Apple's documentation on designing web content for the iPhone; although it's iPhone-specific, much of it applies to mobile devices in general:
http://developer.apple.com/iphone/library/documentation/AppleApplications/Reference/SafariWebContent/OptimizingforSafarioniPhone/OptimizingforSafarioniPhone.html#//apple_ref/doc/uid/TP40006517-SW1
Just my two cents:
Personally, I would make the layout fluid, so that it adjusts itself to the width and height of the display.
Remember that many phones have accelerometers which rotate the page when the phone is tilted sideways. In this case, horizontal scrolling is a big no-no.
Set all non-essential things aside and present only the basic information; advanced or detailed information should be placed behind a "More" button.
Don't use images, or use as few as possible; they eat most of the user's bandwidth. If you are targeting people on GPRS/EDGE, avoid images even more. People on 3G have a better chance.
You can have a look at the Facebook or Gmail mobile interface to get an idea. Google Reader's mobile interface is also good, but still not up to the mark.
I don't know how much people will agree with me, but keep AJAX low. Most phones can't handle the load if your page is heavily AJAXified. Remember, it is a mobile, not a computer; it has limitations. Very high-end phones can probably render it, but to keep your user base broad, minimise the use.
Content load: if the page takes too much time to load, the user probably won't bother using it. They will prefer any other alternative/service that gets their work accomplished.
CSS: use as little CSS as possible. Use colours rather than images, as I said above. Use floats to make the page fit properly on the screen. If you wish, you can use a smaller font, but beware: don't go below a readable size.
Closed. This question needs details or clarity. It is not currently accepting answers.
Closed 3 years ago.
I need to crawl a website and detect how many ads are on a page. I've crawled using PHPCrawl and saved the content to DB. How can I detect if a web page has ads above the fold?
Well, simply put: you can't really. At least not in a simple way. There are many things to consider here, and all of them depend heavily on the web page you are crawling, the device used, etc. I'll try to explain some of the main issues you would need to work around.
Dynamic Content
The first big problem is that you only have the HTML structure, which on its own gives no direct visual information. It would, if this were 1990, but modern websites use CSS and JS to enhance their core structure. What you see in your browser is not just HTML rendered as is: it is subject to CSS styling and even JS-injected code, which can alter the page significantly. For example, any page with a so-called AJAX loader will appear to the crawler as a very simple HTML block; the real page is loaded by JS after that block is rendered.
Viewport
What you describe as "above the fold" is an arbitrary term that can't be defined globally. A smartphone has a very different viewport from a desktop PC, and most modern websites use very different structures for mobile, tablet and desktop devices. But let's say you just want to do it for desktop devices. You could define an average viewport over the most-used screen resolutions (which you can find on the internet). We will define it as 1366x768 for now (based on a quick Google search). However, you still only have a PHP script and an HTML string. Which brings us to the next problem.
Rendering
What you see in your browser is actually the result of a complex system that incorporates the HTML and all of the linked resources to render a visual representation of the code you have crawled. Beyond the core structure of the HTML string you got, any linked resource can (and will) change how the content looks, and can even add more content based on a variety of conditions. So to get the actual visual information, you would need a so-called "headless browser". Only that can give you valid information about what is actually visible inside the desired viewport. If you want to dig into that topic, a good starting point would be a tool like "PhantomJS". However, don't assume that this is an easy task: you still only have bits of data, with no context whatsoever.
Context, or "What is an ad?"
Let's assume you have tackled all these problems and written a script that can actually interpret everything you got from your crawler. You still need to know: what is an ad? And that's a huge problem. Of course, for you as a human it's easy to distinguish between what is part of the website and what is an ad. But translating that into code is more of an AI task than a basic script. For example, ads are usually loaded into a predefined container after the actual page load. That container may only carry a cryptic ID that distinguishes it from the rest of the (actually valid) page content. If you are lucky, it has a class containing a string like "advertisement", but you can't count on that. Ads are targeted by all sorts of ad blockers, so they have a long history of trying to disguise themselves as well as possible. You will have a really hard time figuring out what is an ad and what is valid page content.
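To illustrate how crude such heuristics are, here is a deliberately naive PHP sketch that counts elements whose id or class contains an ad-like token. The token list is an assumption, and as argued above, real ad markup often evades exactly this kind of check.

```php
<?php
// Heuristic sketch: count elements whose id/class hints at an ad slot.
// Treat the result as a rough signal, never as ground truth.
function count_probable_ad_containers($html) {
    $tokens = array('ad', 'ads', 'advert', 'banner', 'sponsor', 'doubleclick', 'adsbygoogle');
    $doc = new DOMDocument();
    libxml_use_internal_errors(true); // tolerate real-world broken markup
    $doc->loadHTML($html);
    libxml_clear_errors();

    $count = 0;
    foreach ($doc->getElementsByTagName('*') as $el) {
        $haystack = strtolower($el->getAttribute('id') . ' ' . $el->getAttribute('class'));
        // Split on non-alphanumerics so "adsbygoogle" and "ad-slot" both surface tokens.
        $words = preg_split('/[^a-z0-9]+/', $haystack, -1, PREG_SPLIT_NO_EMPTY);
        if (array_intersect($words, $tokens)) {
            $count++;
        }
    }
    return $count;
}
```

Note this runs on the raw crawled HTML only; as explained above, ads injected by JavaScript after page load are invisible to it.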
So, while I have only touched on some of the problems you are going to run into, I want to say that it's not impossible. But understand that you are at the most basic entry point, and if you want to build a system that actually works, you'll have to spend a lot of time on fine-tuning, and maybe even on research in the AI field.
And to come back to your question: there is no simple answer to "how to detect if a page has ads", because it is far more complex than you might think. Hope this helps.
I am working on a mobile website (m.website.com) and I am wondering if there is a need to differentiate between different operating systems or mobile browsers?
For example, I am using a mobile detecting PHP class from https://github.com/serbanghita/Mobile-Detect and there are functions like isIOS() and isAndroid() or isChrome() and isSafari() as well as a general isMobile().
So the question is: do I have to have different sets of webpages to accommodate different OSes/browsers? Or is a single set, served to everything that matches isMobile(), enough for all mobile OSes/browsers?
You do not need different pages to target different OSes/browsers per se, but that is only half the answer. Because OSes/browsers have their own quirks, you have to write your pages so that your code is compatible with "most" browsers/OSes, an exercise known as cross-browser-compatible coding. If you don't, e.g. by relying on specific features, bugs or non-standard implementations of one particular OS/browser, then your page might break in other browsers.
Most 'modern' browsers are compatible with the HTML standard. The cross-browser compatibility guidelines are mostly about sticking to the HTML standard; that is, if you are not targeting something like IE6.
The web has many articles on cross-browser development guidelines, e.g. see http://www.htmlbasictutor.ca/cross-browser-compatible.htm
Normally it's enough to use isMobile and create one template for all mobile devices.
You need isIOS() etc. only if you want to create different user experiences or use OS-dependent JavaScript functions/libraries. An example would be a mobile web app which looks like a native app and is available for iOS/Android/etc.
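A sketch of this single-template approach, with the decision pulled into a plain function. The Mobile_Detect class and the isMobile()/isTablet() method names come from the library's README; the template file names are made up for illustration.

```php
<?php
// Pure helper: one mobile template for all phones, desktop for the rest.
// Tablets often cope fine with the desktop layout, so they are excluded.
function choose_template($isMobile, $isTablet) {
    if ($isMobile && !$isTablet) {
        return 'mobile.php';
    }
    return 'desktop.php';
}

// Usage with serbanghita/Mobile-Detect (method names per its README):
//   require_once 'Mobile_Detect.php';
//   $detect = new Mobile_Detect();
//   include choose_template($detect->isMobile(), $detect->isTablet());
```

Keeping the decision in a plain function means the OS-specific methods stay reserved for genuinely OS-specific tweaks, such as app-store links.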
I have never understood why some people say making custom css for each browser is a bad thing. To keep my page size down and download times fast it makes perfect sense to me to make a custom css for the major browsers (especially IE in its many different forms), and a general catch all css for everything else.
If you want to send out a bloated, huge, Swiss army knife of the css world, for all situations then go right ahead I'm not going to stop you.
Fast detection of the browser is important when doing this. Loading a JavaScript file to detect the browser seems slow, so I would prefer to use PHP to detect the browser and send out the appropriate CSS; or at least send a general browser-specific CSS and then use JavaScript to load a more detailed version.
But I've read article after article about why this is a bad thing. The main reason behind each of these articles is that the user agent can be faked: someone is using Firefox, but the server thinks they're using IE7, so it sends out the wrong CSS file.
As a developer/designer of web apps, why is this my problem? If you want to use Firefox, but tell my server you're using Safari or IE*, and get a crappy-looking page, why is that my problem?
And don't throw the whole "if users can't see your site right they'll never come back" argument (or anything similar) at me. A normal user isn't going to be doing this; it's only going to be people who know how to do this, and who will know what's wrong when my site looks crappy.
This is similar to looking at my site on an old Apple II (I have no clue how) and yelling at me because everything looks green.
So is there a good reason, not a personal preference, why I shouldn't use php to detect the browser and send out customized css files?
I do this mostly for the different versions of IE. For some sites, the "if IE6" and "if IE7" sections alone can double or triple the size of the CSS file.
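For concreteness, the scheme being asked about can be sketched as a css.php endpoint that maps the User-Agent to one stylesheet. The UA substrings and file names here are illustrative, not a robust detection scheme.

```php
<?php
// css.php - pick a stylesheet from the user agent, then serve it.
function stylesheet_for_user_agent($ua) {
    if (strpos($ua, 'MSIE 6') !== false)    return 'ie6.css';
    if (strpos($ua, 'MSIE 7') !== false)    return 'ie7.css';
    if (stripos($ua, 'Firefox') !== false)  return 'gecko.css';
    if (stripos($ua, 'WebKit') !== false)   return 'webkit.css';
    return 'base.css'; // catch-all for everything else
}

// Serve the chosen file when hit as <link rel="stylesheet" href="css.php">:
if (isset($_SERVER['HTTP_USER_AGENT'])) {
    header('Content-Type: text/css');
    readfile(__DIR__ . '/css/' . stylesheet_for_user_agent($_SERVER['HTTP_USER_AGENT']));
}
```

Each browser downloads only its own small file, which is exactly the payload-size argument made above; the cost is that a faked or unrecognized UA gets the catch-all.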
Typically, when a user intentionally fakes the user agent string, it is because something that should be viewable in the user's browser is not. For example, some sites may restrict users to IE or Firefox, but the user is using Iceweasel on Debian. Iceweasel is just Firefox renamed for trademark reasons (there are a few other changes too), so there is no reason the site should not work.
Realize that this happens because of (bad) browser detection, not despite it. I would say you don't need to be terribly concerned about this issue. Further, if you can just make your site reasonably cross-browser compatible, it won't matter at all. If you really want to use browser-specific CSS, and you don't want to do so all in one CSS file, don't let a fake user agent stop you.
As long as the only thing you're doing is changing style sheets, there is no valid reason as far as I can tell. If you're attempting to deliver custom security measures by browser, then you'll have issues.
Not sure about PHP, but in Rails it is normal and dead-simple practice to provide CSS files and different layouts based on the user agent, particularly considering that your site is just as likely to be accessed by any of the myriad available mobile devices as by the most popular desktop browsers (currently Firefox); even writing custom MIME types, if need be, is dead simple.
IMO, not doing so is pure laziness on the coder's part, but then not all sites are developed by professional teams with styling gurus at hand. Also, in languages other than Rails it might not be so simple. Sorry, I haven't a clue about PHP, so this may not be an appropriate reply.
In my opinion, starting with normalize.css and a base style sheet, then overriding the base styles as needed, usually works, along with setting appropriate fallbacks. If you really need it, a few media queries and feature detection can go a long way.
One reason you shouldn't base things off of the browser is because major search engines like Google and Yahoo prohibit displaying different content for different browsers. GoogleBot can detect different CSS and HTML and you may get bad search positioning. Additionally, if you use any advertising services you may be in breach of their contract by displaying varying content.
Let's say I have a plain HTML website. More than 80% of my visitors are usually from search engines like Google, Yahoo, etc. What I want to do is to make my whole website in Flash.
However, search engines can't read information from Flash or JavaScript. That means my web page would lose more than half of the visitors.
So how do I show HTML pages instead of Flash to the search engines?
Note: you can reach a specific page/category/etc. in Flash by using a PHP GET parameter. For example, you can surf through all the web pages from the homepage, and link to a specific web page by typing page?id=1234.
Short answer: don't make your whole site in Flash.
Longer answer: If you show humans one view and the googlebot another, you are potentially guilty of "cloaking". If the Google Gods find you guilty, you will be banned to the Supplemental Index, never to be heard from again.
Also, doing an entire site in Flash breaks the basic contract of the web, namely that you can link to specific content from other sites or in emails. If your site has just one URL and everything else is handled inside of Flash ... well, I don't know what you have, but it isn't a website anymore. Adobe may like you, but many people will not. Oh, and Flash is very unfriendly to people with handicaps.
I recommend using Flash where it is needed (videos, animations, etc.), but make it part of an honest-to-God website.
"What I want to do is to make my whole website in Flash."
"So how to accomplish this: show HTML pages instead of Flash?"
These two seem a bit contradictory.
It is important to understand the reasoning behind choosing Flash to build your entire website.
"More than 80 percent of my visitors are usually from search engines."
You did some analysis but did you look at how many visitors access your website via a mobile device? Because apart from SEO, Flash won't serve on the majority of these devices.
Have you considered HTML5 as an alternative for anything you want to do with Flash?
Facebook requires you to build applications in Flash, among other options besides HTML. Why? I do not know, but that is their policy and there has to be a reason.
I have recently been developing simple social applications in Flash (*.swf). My latest app is a website in Flash that displays in a tab of my company's Facebook page; at the same time, I also want to use that website as a regular webpage for my company. The only way I could find to display HTML text within a Flash file is to change the properties of the text wherever I can in CHARACTER to "Render text as HTML" (look for the "<>" symbol). I think that way the search engines will be able to read your content and process your website accordingly. Good luck.
As you say, you can reach the Flash page via a GET variable such as a page ID, which is good. I suggest embedding the Flash in each HTML page and, besides that, adding all the other HTML content in hidden form. That way the crawlers can reach the content while visitors still see the site in Flash, no?
Since no-one actually gave you a straight answer (probably because your question is absolutely face-palm-esque), I'll try:
Consider using the web-development approach called progressive enhancement. Now, it's fair to say that it probably wasn't intended for the Flashification of a website, but you can make use of its principles.
Start with your standard HTML version of your website
Introduce swfobject to dynamically (important bit) swap out the HTML content for its Flash equivalent
Introduce swfaddress to allow for deep linking into your Flash movies (pseudo-URLs)
Granted, steps 2 and 3 are a little more advanced than how I've described them, and your site's size/structure/design may not suit this approach, but at least it's an answer.
All that being said, I agree with the other answers/comments questioning the need to display your entire site in Flash - there are very, very few reasons anyone would do that, and more reasons than already listed not to (iOS devices, etc.)...
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 6 years ago.
So most of us have a lot of content on our sites in one language or another. Since we are web professionals, we spent all the time we could have spent learning human languages learning computer languages instead. So we need some way to translate our content.
Google provides a translation service (among others) and so given their massive empire I am confident that they do (or shortly will) have the best translation service. With that in mind, what is the best way to use it? We could just be lazy and use the little widget that they provide - but we would lose all the content and SEO juice because google would rewrite the links to point to "translate.googleusercontent.com?translate=...".
So my question is - how we can use this service while retaining the translated content on our site?
One method would be to use the Google AJAX API to load the content inline when the user wants it. But since that is powered by JS (like jQuery), search engines won't benefit from it.
Another method would be to use a server-side language (like PHP) to scrape the content from the Google Translate page. But I'm not sure that is 100% legal.
Finally, I was wondering about using mod_rewrite to fetch the page. But again, I don't think this would benefit our site.
RewriteRule ^(.*)-fr$ http://www.google.com/translate_c?hl=fr&sl=en&u=http://site.com/$1 [R,NC]
RewriteRule ^(.*)-de$ http://www.google.com/translate_c?hl=de&sl=en&u=http://site.com/$1 [R,NC]
RewriteRule ^(.*)-es$ http://www.google.com/translate_c?hl=es&sl=en&u=http://site.com/$1 [R,NC]
RewriteRule ^(.*)-it$ http://www.google.com/translate_c?hl=it&sl=en&u=http://site.com/$1 [R,NC]
All you would need to do is add a couple of links on your pages, with a suffix such as "-fr" appended to the end of whatever URL is in the link, and you're set.
// View file
<a href="/current-page-de">View Page in German</a>
Does anyone have any thoughts on this?
edit:
After reading Google's Terms of Service, it seems that:
"You will not, and will not permit your end users or other third parties to: incorporate Google Results as the primary content on your Property or any page on your Property; submit any request exceeding 5000 characters in length;"
Which sounds to me like you can't use the google translate URL to translate the main content - with PHP or AJAX - if that content is the main post of the page. Now how does this work? Why would you build a translation API and then not allow it to be used on the main page content?
Well, you should read the EULA; maybe Google doesn't want you to use their service in that way.
Not to mention that Google Translate may be fine across Indo-European languages, but right now translations to other families of languages really suck and generate comical, meaningless text (e.g. my own language, Hungarian, is a nightmare for Google). I don't think it will advance to an even barely usable level in the near future.
I think the most SEO friendly way to decide what language to display is to look at the Accept-Language request header, although language flag icons wouldn't be a bad idea either, in case someone using an en-us browser feels more comfortable reading French, for example.
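The Accept-Language idea can be sketched in PHP. This parser is deliberately simplified: it ignores region subtags (fr-FR becomes fr) and assumes a reasonably well-formed header, falling back to a default language otherwise.

```php
<?php
// Pick the best-supported language from an Accept-Language header value.
function negotiate_language($acceptLanguage, array $supported, $default = 'en') {
    $best = $default;
    $bestQ = 0.0;
    foreach (explode(',', $acceptLanguage) as $part) {
        $pieces = explode(';q=', trim($part));
        $lang = strtolower(substr($pieces[0], 0, 2)); // primary subtag only
        $q = isset($pieces[1]) ? (float)$pieces[1] : 1.0;
        if ($q > $bestQ && in_array($lang, $supported)) {
            $best = $lang;
            $bestQ = $q;
        }
    }
    return $best;
}

// Typical call site, with flag icons as a manual override elsewhere:
// $lang = negotiate_language($_SERVER['HTTP_ACCEPT_LANGUAGE'] ?? '', array('en', 'fr', 'de'));
```

This keeps the header as the default signal while leaving room for the flag-icon override suggested above.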
It looks like there is an unofficial PHP API for translating with Google Translate. It's hosted on Google Code, so if it were something Google didn't want, it would probably be gone by now.
You should make sure to cache the translated pages, though.
http://code.google.com/p/gtranslate-api-php/
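The caching advice above can be sketched as a small file cache keyed on URL and language. The cache directory and TTL are arbitrary choices, and $translateFn stands in for whichever translation call (the library above, or anything else) you actually use.

```php
<?php
// File cache so the translation service is hit at most once
// per page per language within the TTL window.
function cached_translation($url, $lang, $translateFn,
                            $cacheDir = '/tmp/trans-cache', $ttl = 86400) {
    if (!is_dir($cacheDir)) {
        mkdir($cacheDir, 0777, true);
    }
    $file = $cacheDir . '/' . md5($url . '|' . $lang);
    if (is_file($file) && time() - filemtime($file) < $ttl) {
        return file_get_contents($file); // cache hit
    }
    $html = $translateFn($url, $lang);   // cache miss: call the service
    file_put_contents($file, $html);
    return $html;
}
```

Besides speed, caching also helps stay under request limits like the 5000-character cap quoted from the Terms of Service above.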
To have a truly multilingual site, automated translations are not, and will not be, a good enough solution. On my site, I've added an interface allowing easy human translation, and Google Translate (as well as Babel Fish) is used to suggest translations before a real human does the actual translation. Check out the project at http://transposh.org/ if your site is on WordPress.
The quality of SEO Translate is still questionable. Given that it is based on statistical translation it will improve in the long run, but today it is outright dangerous. I would not use it for my site: as I pointed out in a recent post on my blog about the impact of the new Google algorithm on website translation, the latest Google Panda update penalizes spelling and grammar errors, so machine translation might ultimately penalize you.
After more research: apparently Google does expose a JSON URL for direct requests, so using a server-side language does seem to be an option (as long as the results are cached). However, once you get that content, you still need to figure out how to let users access it in the flow of your current app. Perhaps something like the mod_rewrite method mentioned above?
You can translate text through the Google Language API's REST interface.
Here is a PHP library that does it:
http://code.google.com/p/php-language-api/
A simple example is on the project page.
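As a sketch of what such a REST call looks like in PHP: the endpoint and parameter names below follow the old AJAX Language API v1 shape, which has since been deprecated, so verify them against current documentation before relying on them.

```php
<?php
// Build a request URL for the (retired) Google AJAX Language API v1.
// Endpoint and parameter names are from the old documented shape.
function build_translate_url($text, $from, $to) {
    return 'https://ajax.googleapis.com/ajax/services/language/translate'
         . '?v=1.0&q=' . urlencode($text)
         . '&langpair=' . urlencode($from . '|' . $to);
}

// Fetching and decoding the response would then look like:
//   $json = json_decode(file_get_contents(build_translate_url('Hello', 'en', 'fr')), true);
//   $translated = $json['responseData']['translatedText'];
```

Cache whatever comes back, per the advice above, so each page is translated at most once per language.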