News Service in PHP

What would be the best way of allowing a news service to be used by many external websites? I want to allow external websites to use some of my functions to display news. What do you suggest?

I would set up RSS feeds for each item you wanted to let people use. I'm not sure what sort of system you're running, but WordPress allows you to make custom RSS feeds based on almost anything you want: custom post types, categories, etc.

If your news articles are stored in a database table, you can easily extract them with PHP and populate an RSS feed with them.
Then all you need to do is provide the feed link to those you wish to use it.
The benefit is that it should be straightforward for them to extract the information they need from it as well.
An example of how to do this: http://www.carronmedia.com/create-an-rss-feed-with-php/
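As a rough sketch of what the linked tutorial does: pull rows from the news table and serialize them as RSS 2.0. The column names (`title`, `link`, `description`, `timestamp`) are assumptions; in practice the `$items` array would come from your table via PDO.

```php
<?php
// Build an RSS 2.0 document from an array of article rows.
// $items format is an assumption: each row has title/link/description/timestamp.
function buildRssFeed(string $channelTitle, string $channelLink, array $items): string
{
    $xml = new SimpleXMLElement(
        '<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"/>'
    );
    $channel = $xml->addChild('channel');
    $channel->addChild('title', $channelTitle);
    $channel->addChild('link', $channelLink);
    $channel->addChild('description', 'Latest news');

    foreach ($items as $row) {
        $item = $channel->addChild('item');
        // Pre-escape with htmlspecialchars: addChild() does not escape ampersands.
        $item->addChild('title', htmlspecialchars($row['title']));
        $item->addChild('link', htmlspecialchars($row['link']));
        $item->addChild('description', htmlspecialchars($row['description']));
        $item->addChild('pubDate', date(DATE_RSS, $row['timestamp']));
    }
    return $xml->asXML();
}
```

The feed page would then just `echo buildRssFeed(...)` after sending a `Content-Type: application/rss+xml` header.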

Related

How to fetch data from different blogs into a single web portal?

I am working on a PHP/MySQL project. I have a page called Live Information, and the client needs this page to display all the information from different blogs related to some specific topic.
Any direction on how this can be done?
If the blogs give out an RSS feed you can use an RSS library like Magpie to get at the data.
If they don't, you'll need to get their HTML and parse it. You'll most probably have to write a parser for each site. Have a look at web scraping.
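Magpie is one option for the RSS case; PHP's built-in SimpleXML can also handle the parsing step. A minimal sketch, assuming a standard RSS 2.0 feed structure:

```php
<?php
// Parse one blog's RSS feed into an array of ['title' => ..., 'link' => ...].
// Aggregating several blogs is then just a loop over their feed URLs.
function parseFeedItems(string $rssXml): array
{
    $feed = simplexml_load_string($rssXml);
    if ($feed === false) {
        return []; // malformed feed: skip it rather than crash the portal
    }

    $items = [];
    foreach ($feed->channel->item as $item) {
        $items[] = [
            'title' => (string) $item->title,
            'link'  => (string) $item->link,
        ];
    }
    return $items;
}

// Usage sketch ($feedUrls is a placeholder list of the blogs' feed URLs):
// $all = [];
// foreach ($feedUrls as $url) {
//     $all = array_merge($all, parseFeedItems(file_get_contents($url)));
// }
```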

KnowledgeBase article creation mechanism for php, jquery, mysql

I am looking to create a general knowledge base for my customer service department which pulls information (my thought is from a MySQL database) and populates a page with that content.
Take this site for example:
ToastMaster
I would like to store a primary title, main content, images, etc on my site without actually having to create unique html for each page.
Can anybody make a recommendation for existing suites I could integrate to do so?
Any of a number of content management systems could easily be used for this: Joomla!, Drupal, etc. Try a few out here: http://php.opensourcecms.com/
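If you want to see the core idea those CMSes implement (one template, many articles), a minimal sketch looks like this. The field names and `{{...}}` placeholders are assumptions; in practice the `$article` array would be one MySQL row fetched via PDO by slug or ID.

```php
<?php
// Fill a shared HTML template with one article's fields, so a single
// article.php page can serve every knowledge-base entry.
function renderArticle(array $article, string $template): string
{
    return strtr($template, [
        '{{title}}'   => htmlspecialchars($article['title']),
        '{{content}}' => $article['content'], // assumed trusted HTML from your editors
    ]);
}

// Usage sketch:
// $article = /* SELECT title, content FROM articles WHERE slug = ? */;
// echo renderArticle($article, file_get_contents('kb-template.html'));
```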

PHP, MySQL: Display only required parts of my website in sister website

Now I have my website built on PHP & MySQL. Consider it to be like a forum. When a user posts a reply on my website 1 (e.g. www.website1.com), I want to be able to show the starting thread and its related replies on a sister website of mine. I want to do this in a way that does not show the rest of the page & other page contents (like the logo etc.). I don't think an iframe would be a solution, because an iframe would embed the whole page, and users visiting my sister website (a totally different domain, i.e. www.website2.com) would be able to see all the page contents, like the logo etc. I want to avoid that. I want them to see only limited information from website 1, and only the info that I intend.
I hope that makes sense. In a way, you could say that I am trying to replicate my first website and show only a limited part of it. Users browsing the 2nd website can post a reply on the 2nd website, and it should automatically be posted & visible to the visitors of website 1. Users of website 1 should not know that a user of website 2 has posted it; they would feel that some user from website 1 has posted it. Do I have to use two separate MySQL databases or just one? I think it would be problematic if I try to use different databases. I also feel I might face DB connectivity issues, as I can connect to only one DB at a time.
It's basically like users of website1.com should feel that they are replying to users of website1.com & users of website2.com should feel that they are replying to users of website2.com. (I need it this way to bridge the gap between them). At the same time I want to make the front end of the websites different so that they don't feel that they are replying to some other users outside the domain. These websites would be under my control and I will have access to the source code at any time. If I need to change the source code, these changes are welcome.
Is this really possible?
Thank you in advance.
I'd recommend generating RSS (it might be generated at runtime) and using it on the sister website. If RSS is not suitable for your needs, you can create your own XML-based format (or any other :) )
Make two forums which use one database. Both websites would put new messages in the same database.
Make an API for website1 so that website2 can retrieve and post messages on the forum. Website2 would make an HTTP request to website1, which returns XML or JSON, so that website2 can request a list of posts to display.
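A rough sketch of the JSON variant of that API. The endpoint path, table and column names are all assumptions; the point is just the shape of the exchange.

```php
<?php
// Shared serialization helpers for the cross-site posts API.

// website1 side: wrap the post rows in a JSON envelope.
function encodePosts(array $posts): string
{
    return json_encode(['posts' => $posts], JSON_UNESCAPED_SLASHES);
}

// website2 side: unwrap the response, tolerating a bad payload.
function decodePosts(string $json): array
{
    $data = json_decode($json, true);
    return is_array($data) && isset($data['posts']) ? $data['posts'] : [];
}

// website1 /api/posts.php would look roughly like:
// $posts = $pdo->query('SELECT id, author, body FROM posts ORDER BY id DESC LIMIT 20')
//              ->fetchAll(PDO::FETCH_ASSOC);
// header('Content-Type: application/json');
// echo encodePosts($posts);
//
// website2 then consumes it:
// $posts = decodePosts(file_get_contents('http://www.website1.com/api/posts.php'));
```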
Have both sites connect to the same database and display the content they pull in whatever way is appropriate for the particular site. Each site can only pull the fields relevant to that site.
If the idea is to have two websites with the same data but different presentations, then you would want to simply share a single database between them - assuming they are hosted in the same place and can both get at the database.
You can then just create different PHP pages that both access the same database in the same way but display the data differently.
The best way to do this would be to have a shared library of functions or classes that both sites use to manipulate the data. You would then build a different "presentation layer" on top of that for each site.
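A minimal sketch of what that shared library might look like, assuming a single shared MySQL database with a hypothetical `posts` table; adapt the table and column names to your real schema. Each site includes this class and wraps the results in its own presentation layer.

```php
<?php
// Shared data layer included by both website1 and website2.
class ForumData
{
    private PDO $db;

    public function __construct(PDO $db)
    {
        $this->db = $db;
    }

    // Fetch a thread's replies; each site renders these however it likes.
    public function repliesForThread(int $threadId): array
    {
        $stmt = $this->db->prepare(
            'SELECT author, body, created_at FROM posts
             WHERE thread_id = ? ORDER BY created_at'
        );
        $stmt->execute([$threadId]);
        return $stmt->fetchAll(PDO::FETCH_ASSOC);
    }

    // Post a reply from either site; both write to the same table, so a
    // reply on website2 is immediately visible on website1 and vice versa.
    public function addReply(int $threadId, string $author, string $body): void
    {
        $stmt = $this->db->prepare(
            'INSERT INTO posts (thread_id, author, body, created_at)
             VALUES (?, ?, ?, ?)'
        );
        $stmt->execute([$threadId, $author, $body, date('Y-m-d H:i:s')]);
    }
}
```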

Generate a list of all the pages contained in a website programmatically, using PHP

How is it possible to generate a list of all the pages of a given website programmatically using PHP?
What I'm basically trying to achieve is to generate something like a sitemap: a nested unordered list with links for all the pages contained in a website.
If all pages are linked to one another, then you can use a crawler or spider to do this.
If there are pages that are not all linked you will need to come up with another method.
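The core step of such a crawler is extracting same-site links from each fetched page. A sketch of just that step (a full spider would wrap it in a loop with a URL queue and a "visited" set):

```php
<?php
// Pull every <a href> out of an HTML page, keeping relative links and
// absolute links that stay on the given host.
function extractLinks(string $html, string $siteHost): array
{
    $doc = new DOMDocument();
    @$doc->loadHTML($html); // suppress warnings from real-world, imperfect HTML

    $links = [];
    foreach ($doc->getElementsByTagName('a') as $anchor) {
        $href = $anchor->getAttribute('href');
        if ($href === '') {
            continue;
        }
        $host = parse_url($href, PHP_URL_HOST);
        // No host means a relative link, which is same-site by definition.
        if ($host === null || $host === $siteHost) {
            $links[] = $href;
        }
    }
    return array_values(array_unique($links));
}
```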
You can try this:
Add an "image bug / web beacon / web bug" to each page you want tracked (a tiny tracking image whose src points at /scripts/logger.php), OR alternatively add a JavaScript function to each page that makes a call to /scripts/logger.php. You can use any of the JavaScript libraries that make this super simple, like jQuery, MooTools, or YUI.
Create the logger.php script, have it save the request's originating URL somewhere like a file or a database.
Pros:
- Fairly simple
Cons:
- Requires edits to each page
- Pages that aren't visited don't get logged
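A sketch of what logger.php could do. The flat-file log path is an assumption; a database insert would work the same way.

```php
<?php
// Append the referring page's URL to a log file, one hit per line.
function logPageHit(string $referer, string $logFile): void
{
    file_put_contents(
        $logFile,
        date('c') . ' ' . $referer . PHP_EOL,
        FILE_APPEND | LOCK_EX // lock so concurrent hits don't interleave
    );
}

// When requested as the image bug or from JavaScript:
// logPageHit($_SERVER['HTTP_REFERER'] ?? 'unknown', __DIR__ . '/pages.log');
// header('Content-Type: image/gif');
// echo base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7'); // 1x1 gif
```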
Some other techniques that don't really fit your need to do it programmatically but may be worth considering include:
- Create a spider or crawler
- Use a ripper such as cURL or Teleport Plus
- Use Google Analytics (similar to the image bug technique)
- Use a log analyzer like Webstats or a freeware UNIX webstats analyzer
You can easily list the files with the glob function... But if the pages use includes/requires and other tricks to combine multiple files into "one page", you'll need to import the Google "site:mysite.com" search results, or just maintain a table with the URL of every page :P
Maybe this can help:
http://www.xml-sitemaps.com/ (SiteMap Generator)

how can google find me if I am inside a mysql table?

I am creating a classifieds website.
I'm storing all ads in a MySQL database, in different tables.
Is it possible to find these ads somehow from Google's search engine?
Is it possible to create meta information about each ad so that google finds them?
How do major companies do this?
I have thought about auto-generating an HTML page for each ad inserted, but 500 thousand auto-generated HTML pages doesn't really sound like a good solution!
Any thoughts and ideas?
UPDATE:
Here is my basic website so far:
(ALL PHP BASED)
I have a search engine which searches database for records.
After finding and displaying search results, you can click on a result ('ad') and then PHP fetches info from the database and displays it, simple!
In the 'put ad' section of my site, you can put your own ad into a mysql database.
I need to know how I should make Google find the ads on my website too, as I don't think Google's crawler can search my database just because users can.
Please explain your answers more thoroughly so that I understand fully how this works!
Thank you
Google doesn't find database records. Google finds web pages. If you want your classifieds to be found then they'll need to be on a Web page of some kind. You can help this process by giving Google a site map/index of all your classifieds.
I suggest you take a look at Google Basics and Creating and submitting Sitemaps. Basically the idea is to spoon-feed Google every URL you want Google to find. So if you reference your classifieds this way:
http://www.mysite.com/classified?id=1234
then you create a list of every URL required to find every classified and yes this might be hundreds of thousands or even millions.
The above assumes a single classified per page. You can of course put 5, 10, 50 or 100 on a single page and then create a smaller set of URLs for Google to crawl.
Whatever you do however remember this: your sitemap should reflect how your site is used. Every URL Google finds (or you give it) will appear in the index. So don't give Google a URL that a user couldn't reach by using the site normally or that you don't want a user to use.
So while 50 classifieds per page might mean fewer requests from Google, if that's not how you want users to use your site (or a view you want to provide), then you'll have to do it some other way.
Just remember: Google indexes Web pages not data.
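A sketch of generating that sitemap XML. The URL pattern matches the /classified?id=N example above; in practice the IDs would come from your ads table, and very large sites split the output into multiple sitemap files of up to 50,000 URLs each, tied together by a sitemap index.

```php
<?php
// Build a sitemaps.org-format XML document listing one URL per ad.
function buildSitemap(array $urls): string
{
    $xml = '<?xml version="1.0" encoding="UTF-8"?>' . "\n"
         . '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";
    foreach ($urls as $loc) {
        // Escape &, <, > so URLs with query strings stay valid XML.
        $xml .= '  <url><loc>' . htmlspecialchars($loc) . '</loc></url>' . "\n";
    }
    return $xml . '</urlset>' . "\n";
}

// Usage sketch (ad IDs assumed to come from the ads table):
// $urls = array_map(
//     fn ($id) => 'http://www.mysite.com/classified?id=' . $id,
//     $adIds
// );
// file_put_contents('sitemap.xml', buildSitemap($urls));
```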
How would you normally access these classifieds? You're not just keeping them locked up in the database, are you?
Google sees your website like any other visitor would see your website. If you have a normal database-driven site, there's some unique URL for each classified where it is displayed. If there's a link to it somewhere, Google will find it.
If you want Google to index your site, you need to put all your pages on the web and link between them.
You do not have to auto-generate a static HTML page for everything, all pages can be dynamically created (JSP, ASP, PHP, what have you), but they need to be accessible for a web crawler.
Google can find you no matter where you try to hide. Even if you can somehow fit yourself into a mysql table. Because they're Google. :-D
Seriously, though, they use a bot to periodically spider your site so you mostly just need to make the data in your database available as web pages on your site, and make your site bot-friendly (use an appropriate robots.txt file, provide a search engine-friendly site map, etc.) You need to make sure they can find your site, so make sure it's linked to by other sites -- preferably sites with lots of traffic.
If your site only displays specific results in response to search terms you'll have a harder time. You may want to make full lists of the records available for people without search terms (paged appropriately if you have lots of data).
First, create a PHP file that pulls the index plus a human-readable reference for all records.
That is your main page, broken out into categories (as in the case of Craigslist.com: by country and state).
Each category link then feeds the selected value back to the PHP script, regardless of how many levels there are, finally reaching the ad itself.
So if a category is selected which contains more categories (like states contain cities), display the next list of categories; otherwise display the list of ads for that city.
This gives Google.com a way to index a site (i.e., a MySQL database) dynamically without creating static content for the millions (billions or trillions) of records involved.
This is just an idea of how to get Google.com to index a database.
