Social Network & Dynamic vs Static pages. Where to draw the line? - php

As I come to the end of my project I am starting to wonder if I made it too dynamic. I have designed this social networking site and 90% of it is based on jQuery. It looks nice and it loads fast, but I have started to wonder if it is too dynamic...
My concern is that once you log in, 95% of what you do is jQuery-based, so the user never leaves the same URL. If this is true, how is a search engine like Google supposed to index my website?
Is this the part where I ask myself which parts of the site I want to be indexed and make them static pages instead?
Basically, it has occurred to me that if the user profiles you browse on my site are loaded through jQuery requests, then these profiles can never be found in a Google search, because the Google spider would never see them. Is this true?
Thank you for any thoughts on this,
Vini

Make your site work in both "modes". For example, if I'm on my dashboard and I want to check out my friend Joe's profile, there should just be an A tag with the href set to something like "/profiles/joe".
Now, onDomReady, when the page loads, run your JavaScript to go through those links, attach click handlers to them, and load the profile dynamically using your existing jQuery style.
This development style is called "progressive enhancement" and allows both search engines and human accessibility devices to work better with your website. Check it out.
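One way to serve both "modes" from the same URL is to let the server decide whether to send the whole page or just the profile fragment. A rough sketch, where profile.php, render_profile() and the include files are placeholder names (jQuery's AJAX requests send the X-Requested-With header by default):
<?php
// profile.php - placeholder name; serves /profiles/joe to crawlers and to the jQuery enhancement alike.
$username = basename(isset($_GET['user']) ? $_GET['user'] : '');
$profileHtml = render_profile($username); // hypothetical helper that builds the profile markup
$isAjax = isset($_SERVER['HTTP_X_REQUESTED_WITH'])
    && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) === 'xmlhttprequest';
if ($isAjax) {
    // The click handler asked for it: return only the profile fragment.
    echo $profileHtml;
} else {
    // A search engine, or a user without JavaScript, gets a full crawlable page.
    include 'header.php';
    echo $profileHtml;
    include 'footer.php';
}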

Related

How should I build a website for someone else who wants to embed their own Youtube links?

First post here.
I am making a site for someone who wants to embed their own YouTube links. I am relatively new to web development and this is my first commission, so I was wondering how I could hand the site over to the client and allow them to upload their links without me having to edit the markup for them.
I understand how to use an <iframe> YouTube embed in HTML and am considering creating a database using phpMyAdmin, with a form the client can fill in with the iframe link whenever he needs to embed a new video.
I understand if that sounds like a convoluted way of doing this, but if anyone knows a more intuitive way to solve this issue, please let me know :)
I think this can be done with a database, maybe MySQL to keep it simple, and a simple backend programming language; PHP would do.
You create an admin page just to collect the URLs from the admin: a nice little login page backed by a table that stores the admin username and password. You can also use this module to assign roles to admins, and to add or delete an admin.
Inside the module created above, create a page with a form to collect URLs and store them in the database. If you are going to categorize the videos, use suitable columns. Say you are going to categorize based on genre and length; the table in your database should then contain genre and length columns so that you can use them as filters later on.
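A minimal sketch of that storage step, assuming a hypothetical videos table with url, genre and length columns and an existing PDO connection in $pdo:
<?php
// save_video.php - handles the admin form submit; 'videos' is a hypothetical table.
if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    $stmt = $pdo->prepare(
        'INSERT INTO videos (url, genre, length) VALUES (:url, :genre, :length)'
    );
    $stmt->execute(array(
        ':url'    => $_POST['url'],    // e.g. https://www.youtube.com/embed/VIDEO_ID
        ':genre'  => $_POST['genre'],
        ':length' => $_POST['length'],
    ));
}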
Use cookies to create a session for the admin. This avoids redirecting him to the login page after every single reload.
All other users, who are not the admin, should not be redirected to the admin page. Redirect them to a common page where they can search and enjoy the videos that the admin has added.
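On that common page, a sketch under the same assumptions (plus an auto-increment id column) could simply loop over the stored rows and print the embeds:
<?php
// videos.php - lists every stored video for normal visitors; 'videos' is the hypothetical table above.
foreach ($pdo->query('SELECT url, genre FROM videos ORDER BY id DESC') as $video) {
    echo '<h3>' . htmlspecialchars($video['genre']) . '</h3>';
    echo '<iframe width="560" height="315" src="' . htmlspecialchars($video['url'])
        . '" frameborder="0" allowfullscreen></iframe>';
}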
Again, as mentioned in many comments above, WordPress does all of this. You have the option to use it too; it's free.

Display a view in the mobile app, from the website (knowing if user comes from application)

Let's see if I can explain this and make sense at the same time...
I have a website and a mobile app for that website. In some cases I will use a view from the website inside the app. I won't redirect the user to the website; the website view will be displayed inside the application. But in this view I will also make some changes. It is a bit like media queries, but at the same time not, because if the user is using a mobile browser rather than the app, the website will show as normal, responsive of course.
UPDATE (trying to explain a bit better)
If you use the mobile app, only a specific part of the website will be shown. In this case it will be the product categories. The header, footer and all the other content will be gone except for the product categories. If you use a mobile browser, for example Safari, the header, footer and all that stuff will be shown.
So the question here is: how can I check, using PHP, if the user is coming from the mobile app, and then display a certain view?
You could have the mobile application open the website with a GET variable set. For example, the mobile app would open www.myWebsite.com?source=mobile.
This way, when the PHP on the page runs, you can have something like:
if ((isset($_GET['source'])) && ($_GET['source'] == 'mobile'))
{
    // This is a mobile view!
}
A downside to this is that anyone could just add ?source=mobile to the URL to trick the site into thinking the request is coming from the app.
If you use a third-party mobile application, you can try to figure out the source of the request from the HTTP User-Agent header. View issues like this are usually solved with CSS @media rules.
I hope I understood your problem correctly.
You can check by user agent, using $_SERVER['HTTP_USER_AGENT'];
Or else you can use third-party libraries like
http://mobiledetect.net, or you can use the Bootstrap library for that.
If you want a different design for mobile and desktop, then I would suggest using the Bootstrap library.
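If you control the app, a common variation is to have its web view append a custom token to the User-Agent and check for it in PHP. A minimal sketch, where the token "MyCompanyApp" and the two include files are made-up names:
<?php
// "MyCompanyApp" is a made-up token; configure your app's web view to send it in its User-Agent.
$userAgent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
if (strpos($userAgent, 'MyCompanyApp') !== false) {
    // Request came from the mobile app: show only the product categories view.
    include 'views/categories_only.php';   // placeholder include
} else {
    // Ordinary browser (mobile or desktop): show the full site with header and footer.
    include 'views/full_site.php';         // placeholder include
}
A visitor can still spoof the User-Agent, but at least it cannot be done by simply editing the URL.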

Web app integrated in a CMS

I want to build a site that allows user to get filtered subsets of some information I have stored on a database. The information will change frequently and I will want to create a post every now and again to say when new information or features become available, so the site has CMS aspects and web app aspects.
Coming from a desktop programming environment, I have to admit I'm not entirely certain how far a CMS like Drupal or WordPress can take me. The web app would basically be a form with several interdependent options and a Submit button. When the Submit button is clicked, I would want to call a web service that returns the information in one of a variety of formats.
So will I need to install my CMS and have an iframe or just a link to the web app which is developed completely outside of it, or can I build my web app's front end directly in the CMS (and hopefully achieve a nice, integrated look) and just have it call the service behind a button?
I half-expect that this web app would become a WordPress/Drupal plugin, or am I barking up the wrong tree?
Yes, you can do this. Try out jQuery; I think it is already included (at least in WordPress). "Break into" the layout and add a JS file of your own that makes an AJAX call to a URL, gets some HTML back, and puts that content into a certain HTML component (a div that you also add there yourself, with a certain id).
The user clicks a button, that triggers the AJAX call, and voila.
There are also more complicated ways, like developing your own plugin that makes SOAP calls and integrates with WordPress/Drupal etc. But since you say you are a desktop developer, this is the approach closest to your knowledge and perspective.
See jQuery's AJAX .load() method for the simplest way to do this.
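For the server side of that AJAX call, a minimal sketch, assuming a hypothetical items table and a PDO connection in $pdo (the jQuery on the page just drops the returned HTML into the target div):
<?php
// filter.php - returns an HTML fragment with the filtered results; 'items' is a hypothetical table.
$category = isset($_GET['category']) ? $_GET['category'] : '';
$stmt = $pdo->prepare('SELECT name, description FROM items WHERE category = :cat');
$stmt->execute(array(':cat' => $category));
foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
    echo '<div class="result"><strong>' . htmlspecialchars($row['name'])
        . '</strong> ' . htmlspecialchars($row['description']) . '</div>';
}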

Help - use PHP-browser, or proxy, or get_page_contents, or include page, or something else?

I am trying to develop a web application for which I need to capture a specific user-driven event (such as mouse dblclick) occurring on a different-website page loaded through my website.
What I want to do is :
User visits my website - hosted by me.
There, the user types in any website URL (e.g.: http://www.example.com)
That URL's page gets loaded as is.
When the user double-clicks the mouse over any link or image on that page, a popup/side panel is displayed with content related to that particular image or link.
I can do this with a combination of PHP get_page_contents (or including the page) and a JavaScript dblclick handler.
However, when the user clicks on any link or submits a form, control goes to that other website, where I cannot show the side panel.
I might be able to handle the links by proxifying them when the user clicks on any of them. How do I handle form submissions and other stuff?
I can use a full-featured proxy, but that will be too heavy just for the purpose of capturing the event.
My question is: is there a way to write some kind of lightweight PHP script that sits on my website, loads other websites' content as is, but lets me capture the mouse dblclick event to show related content in the side panel?
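For reference, this is roughly the bare-bones fetch-and-inject approach I have in mind (viewer.php is a placeholder name, I use file_get_contents for "get_page_contents", and the fetched page is assumed to be shown in an iframe on my site; links, forms and AJAX on that page are not handled, which is exactly my problem):
<?php
// viewer.php?url=... - placeholder name; fetches the remote page and injects a dblclick hook.
$url = isset($_GET['url']) ? $_GET['url'] : '';
$html = file_get_contents($url);
// Script injected before </body>: reports double-clicks to the parent page,
// which would then show the side panel (not shown here).
$hook = '<script>
document.addEventListener("dblclick", function (e) {
    parent.postMessage({tag: e.target.tagName, src: e.target.src || e.target.href || ""}, "*");
});
</script>';
echo str_replace('</body>', $hook . '</body>', $html);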
I have already searched the internet, but could not find anything.
Any help is really appreciated. Thanks.
This sounds way too complicated to ever get reliably working IMO. Proxifying complex requests on 3rd-party pages? Maybe even with some additional AJAX that you'd have to proxify too? I may be wrong, but I think you'll go crazy and get swamped with complaints about sites not working.
I don't know what your web application is supposed to do but I would strongly consider building a Firefox extension (that has much more rights to access and do things on 3rd party sites) or similar.

How can Google find me if I am inside a MySQL table?

I am creating a classifieds website.
I'm storing all ads in a MySQL database, in different tables.
Is it possible for these ads to be found somehow by Google's search engine?
Is it possible to create meta information about each ad so that Google finds them?
How do major companies do this?
I have thought about auto-generating an HTML page for each ad inserted, but 500 thousand auto-generated HTML pages doesn't really sound like a good solution!
Any thoughts and ideas?
UPDATE:
Here is my basic website so far:
(ALL PHP BASED)
I have a search engine which searches the database for records.
After finding and displaying search results, you can click on a result ('ad') and then PHP fetches info from the database and displays it, simple!
In the 'put ad' section of my site, you can put your own ad into the MySQL database.
I need to know how I should make Google find the ads on my website as well, as I don't think the Google crawler can search my database just because users can.
Please explain your answers more thoroughly so that I understand fully how this works!
Thank you
Google doesn't find database records. Google finds web pages. If you want your classifieds to be found then they'll need to be on a Web page of some kind. You can help this process by giving Google a site map/index of all your classifieds.
I suggest you take a look at Google Basics and Creating and submitting Sitemaps. Basically the idea is to spoon-feed Google every URL you want Google to find. So if you reference your classifieds this way:
http://www.mysite.com/classified?id=1234
then you create a list of every URL required to find every classified and yes this might be hundreds of thousands or even millions.
The above assumes a single classified per page. You can of course put 5, 10, 50 or 100 on a single page and then create a smaller set of URLs for Google to crawl.
Whatever you do however remember this: your sitemap should reflect how your site is used. Every URL Google finds (or you give it) will appear in the index. So don't give Google a URL that a user couldn't reach by using the site normally or that you don't want a user to use.
So while 50 classifieds per page might mean fewer requests from Google, if that's not how you want users to use your site (or a view you want to provide), then you'll have to do it some other way.
Just remember: Google indexes Web pages not data.
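A minimal sketch of that spoon-feeding step, assuming a hypothetical ads table with an id column, a PDO connection in $pdo, and the URL pattern shown above:
<?php
// sitemap.php - emits one <url> entry per classified for Google to crawl; 'ads' is a hypothetical table.
header('Content-Type: application/xml; charset=utf-8');
echo '<?xml version="1.0" encoding="UTF-8"?>';
echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">';
foreach ($pdo->query('SELECT id FROM ads') as $row) {
    echo '<url><loc>http://www.mysite.com/classified?id=' . (int) $row['id'] . '</loc></url>';
}
echo '</urlset>';
Keep in mind that the sitemap protocol caps a single file at 50,000 URLs, so with hundreds of thousands of ads you would split this into several files and list them in a sitemap index.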
How would you normally access these classifieds? You're not just keeping them locked up in the database, are you?
Google sees your website like any other visitor would see your website. If you have a normal database-driven site, there's some unique URL for each classified where it is displayed. If there's a link to it somewhere, Google will find it.
If you want Google to index your site, you need to put all your pages on the web and link between them.
You do not have to auto-generate a static HTML page for everything; all pages can be dynamically created (JSP, ASP, PHP, what have you), but they need to be accessible to a web crawler.
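A minimal sketch of such a dynamically created page, matching the classified?id=... URL pattern from the answer above and assuming a hypothetical ads table and a PDO connection in $pdo (surrounding markup trimmed):
<?php
// classified.php - one crawlable page per ad, generated straight from the hypothetical 'ads' table.
$id = isset($_GET['id']) ? (int) $_GET['id'] : 0;
$stmt = $pdo->prepare('SELECT title, description FROM ads WHERE id = :id');
$stmt->execute(array(':id' => $id));
$ad = $stmt->fetch(PDO::FETCH_ASSOC);
if (!$ad) {
    header('HTTP/1.1 404 Not Found');
    exit('Ad not found');
}
echo '<title>' . htmlspecialchars($ad['title']) . '</title>';
echo '<h1>' . htmlspecialchars($ad['title']) . '</h1>';
echo '<p>' . htmlspecialchars($ad['description']) . '</p>';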
Google can find you no matter where you try to hide. Even if you can somehow fit yourself into a mysql table. Because they're Google. :-D
Seriously, though, they use a bot to periodically spider your site, so you mostly just need to make the data in your database available as web pages on your site, and make your site bot-friendly (use an appropriate robots.txt file, provide a search-engine-friendly site map, etc.). You also need to make sure they can find your site, so make sure it's linked to by other sites -- preferably sites with lots of traffic.
If your site only displays specific results in response to search terms you'll have a harder time. You may want to make full lists of the records available for people without search terms (paged appropriately if you have lots of data).
First, create a PHP file that pulls the index plus a human-readable reference for all records.
That is your main page, broken out into categories (as in the case of Craigslist.com: by country and state).
Each category link then feeds the selected value back to the PHP script, regardless of how many levels there are, until finally reaching the ad itself.
So, if a category is selected which contains more categories (like states contain cities), then display the next list of categories; else display the list of ads for that city.
This will give Google a way to index a site (i.e. the MySQL database) dynamically without creating static content for the millions (billions or trillions) of records involved.
This is just an idea of how to get Google to index a database.
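A rough sketch of that drill-down, assuming hypothetical categories (id, parent_id, name) and ads (id, category_id, title) tables, top-level categories having parent_id = 0, and a PDO connection in $pdo:
<?php
// index.php?cat=... - lists sub-categories until the deepest level, then the ads.
// 'categories' and 'ads' are hypothetical tables.
$cat = isset($_GET['cat']) ? (int) $_GET['cat'] : 0;
$stmt = $pdo->prepare('SELECT id, name FROM categories WHERE parent_id = :cat');
$stmt->execute(array(':cat' => $cat));
$children = $stmt->fetchAll(PDO::FETCH_ASSOC);
if ($children) {
    // This level still contains categories: list them as plain crawlable links.
    foreach ($children as $child) {
        echo '<a href="index.php?cat=' . $child['id'] . '">'
            . htmlspecialchars($child['name']) . '</a><br>';
    }
} else {
    // Deepest level reached: list the ads in this category.
    $stmt = $pdo->prepare('SELECT id, title FROM ads WHERE category_id = :cat');
    $stmt->execute(array(':cat' => $cat));
    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $ad) {
        echo '<a href="classified?id=' . $ad['id'] . '">'
            . htmlspecialchars($ad['title']) . '</a><br>';
    }
}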
