I want to get the complete content of a news article or post from a website via its feed. But as we know, many websites only publish part of each article in their feed.
Of course, I know there is a script called SimplePie that was developed to fetch website content via feeds, but it does not retrieve the full content of an article.
I did find a script called Full-Text Feeds that does this, but it is not free, and I want a free script.
Do you know a similar script, or another way to do what I need?
The code behind Five Filters' content extraction is actually open source, and is based on Readability's original JavaScript (from before they became a service).
You should be able to use it like this:
$page = file_get_contents($item_url);    // fetch the article page behind the feed item
$readability = new Readability($page);   // parse it with the Readability PHP port
if ($result = $readability->init()) {
    // extraction succeeded: grab the main article body
    $content = $readability->getContent()->innerHTML;
}
Not entirely sure what you're trying to do here, but this might help you:
$full_page_content = file_get_contents('http://www.example.com/');
Edit: OK, if I understand you correctly, you'll need to do something like this:
1. Get the RSS feed
2. Use SimplePie or something like it to go through each feed item
3. For each item in the RSS feed:
   - Get the item's URL
   - Get the content from that URL
   - Strip out the HTML / extract only the text you need
4. Combine all of these into a new RSS feed and send that to the user
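The steps above can be sketched as a small pipeline. This is only an illustration: `extractText()` is a crude stand-in for a real extractor such as Readability, and the fetcher is injected as a callable so the example runs without a network connection (in production you would pass `'file_get_contents'`).

```php
<?php
// Stand-in for a real content extractor (e.g. Readability):
// strip tags and collapse whitespace.
function extractText($html) {
    return trim(preg_replace('/\s+/', ' ', strip_tags($html)));
}

// $items: arrays with 'title' and 'link', e.g. built from SimplePie's
// get_items(); $fetch: a callable that downloads a URL.
function expandFeedItems(array $items, $fetch) {
    $out = array();
    foreach ($items as $item) {
        $page = $fetch($item['link']);   // fetch the full article page
        $out[] = array(
            'title'   => $item['title'],
            'link'    => $item['link'],
            // fall back to an empty body if the fetch failed
            'content' => $page === false ? '' : extractText($page),
        );
    }
    return $out;
}

// Example with a canned fetcher standing in for the real HTTP request:
$items = array(array('title' => 'Post', 'link' => 'http://example.com/post'));
$fake  = function ($url) { return '<html><body><p>Full article text.</p></body></html>'; };
print_r(expandFeedItems($items, $fake));
```

The last step (re-serializing `$out` into a new RSS feed) is plain XML generation and is left out here.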
Note: This isn't a simple thing to do. There is a reason that Full-Text RSS can charge for their product.
You could use http://magpierss.sourceforge.net/cookbook.shtml (free).
It retrieves RSS feeds. There are many, many PHP scripts on the web that do that... Google is your friend! :)
So I have this search http://www.ncbi.nlm.nih.gov/pubmed/?term=Streptococcus+dysgalactiae+subspecies+equisimilis and I want to get its RSS feed (not manually), i.e. a string with the XML, but I want to do it using PHP. I've tried to search for a method on NCBI, but I guess I'm not very good at it.
Do I have to search the HTML for the href, or something like this: getting RSS feeds on a website?
The feed URL for the search isn't contained on the page. Clicking 'Create RSS' makes a call to the server to store the search terms and returns a feed with a new rss_guid. If there isn't an API to call, you would need to examine the JavaScript and simulate a browser clicking that button.
(As an implementation detail, the site returns different guids even when the search is the same, which implies that each 'Create RSS' is creating a new resource.)
I'm trying to use some data from another site as a news feed on mine that will update automatically. I have permission to use this information.
I'm trying to decide whether to just use the RSS feed and add that to my site, or to use the curl command.
What do you recommend using?
I want the text to go into a rectangular space in a div on my page, so I can customize it to match the colour and design of my page.
Thanks
If there is an RSS feed, use that! That's what it's for.
Get the text of the RSS, and display it in a div.
You should also link to the source page, which is provided in the RSS, as that is in most cases something the owner would want.
Use the simplexml_load_file() function to read the RSS, and process it with SimpleXML. See also some documentation:
http://nl2.php.net/simplexml_load_file
http://nl2.php.net/manual/en/book.simplexml.php
You should be able to handle it on your own. If you can't, try some tutorials on SimpleXML or even PHP itself.
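A minimal sketch of the SimpleXML approach: `renderFeed()` turns a parsed feed into the HTML for a div. In production you would pass it the result of `simplexml_load_file('http://example.com/feed')` (the URL is a placeholder); here an inline sample feed keeps the example self-contained.

```php
<?php
// Render a parsed RSS feed as a styleable <div>, linking each item
// back to its original page as the answer above recommends.
function renderFeed(SimpleXMLElement $rss) {
    $html = '<div class="news-box">';
    foreach ($rss->channel->item as $item) {
        $html .= sprintf(
            '<p><a href="%s">%s</a><br>%s</p>',
            htmlspecialchars((string) $item->link),
            htmlspecialchars((string) $item->title),
            htmlspecialchars((string) $item->description)
        );
    }
    return $html . '</div>';
}

// Inline sample feed so the sketch runs without a network connection:
$sample = '<rss version="2.0"><channel>
  <item><title>Hello</title><link>http://example.com/1</link>
  <description>First post</description></item>
</channel></rss>';
echo renderFeed(simplexml_load_string($sample));
```

You can then style `.news-box` in your CSS to match your page's colours.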
How can this be done?
Right now I use fulltextrssfeed.com and WP Robot, but I don't really like it, because fulltextrssfeed.com posts only about 5 articles when there were more than 50 in the original RSS feed. And none of the plugins I tested (FeedWordPress, RSS Poster, WP-o-Matic, WP Robot) could post full articles from every feed I used.
The feeds themselves don't contain the full text of the articles.
What fulltextrssfeed.com does is fetch the URL of each feed entry and extract the full text from there. This is what you have to do too, if you want all the articles.
You can use http://www.feedsapi.com/ or http://fivefilters.org/content-only/
You can use the API provided by Full Post RSS Feed, which extracts 10 posts from the RSS feed; it is basically a hosted version of the FiveFilters premium API. FiveFilters regularly updates their templates to extract content from the major themes and sites.
If it is not able to extract the feed from a given site, you can add a template for that site. A template specifies the tags and classes to use when extracting the content. For example, in WordPress, the content mostly starts at an article tag or a div tag with a particular class. If you know a programming language, you can also write your own RSS parser and full-content extractor using cURL or any other method. It is fairly simple; you just need some bandwidth to understand the concept.
I am working on a PHP/MySQL project. I have a page called Live Information, and the client needs this page to display all the information from different blogs related to a specific topic.
So, any direction on how this can be done?
If the blogs give out an RSS feed you can use an RSS library like Magpie to get at the data.
If they don't, you'll need to fetch their HTML and parse it. You'll most probably have to write a parser for each site. Have a look at web scraping.
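For the feed case, one way to build such a Live Information page is to merge several blog feeds into one list sorted newest-first. This sketch uses the built-in SimpleXML instead of Magpie so it needs no extra libraries; in production you would build the input with `file_get_contents($feedUrl)` for each blog, while here a canned feed keeps it self-contained.

```php
<?php
// Merge the items of several RSS feeds (given as XML strings)
// into one array sorted by publication date, newest first.
function mergeFeeds(array $feedXmlStrings) {
    $all = array();
    foreach ($feedXmlStrings as $xml) {
        $rss = simplexml_load_string($xml);
        if ($rss === false) continue;        // skip feeds that fail to parse
        foreach ($rss->channel->item as $item) {
            $all[] = array(
                'title' => (string) $item->title,
                'link'  => (string) $item->link,
                'time'  => strtotime((string) $item->pubDate),
            );
        }
    }
    usort($all, function ($a, $b) { return $b['time'] - $a['time']; });
    return $all;
}

// Self-contained sample standing in for a real blog feed:
$sample = '<rss version="2.0"><channel>
  <item><title>Old</title><link>http://a.example/1</link>
        <pubDate>Mon, 01 Jan 2018 00:00:00 +0000</pubDate></item>
  <item><title>New</title><link>http://a.example/2</link>
        <pubDate>Tue, 02 Jan 2018 00:00:00 +0000</pubDate></item>
</channel></rss>';
print_r(mergeFeeds(array($sample)));
```

The merged array can then be looped over in the Live Information page template.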
On a website I am maintaining for a radio station, there is a page that displays news articles. Right now the news is posted in an HTML page, which is then read by a PHP page that includes all the navigation. I have been asked to turn this into an RSS feed. How do I do this?
I know how to make the XML file, but the person who edits the news file is not technical and needs a WYSIWYG editor. Is there a WYSIWYG editor for XML? Once I have the feed, how do I display it on my site? I'm working with PHP on this site, so a PHP solution would be preferred.
Use Yahoo Pipes!: you don't need programming knowledge, and the load on your site will be lower. Once you've got your feed, display it on your site using a simple anchor with an image in HTML. You could consider piping your feed through FeedBurner too.
And for the freebie: if you want to track your feed awareness data in RSS, use my service here.
Are you saying that someone will insert the feed content by hand?
Usually feeds are generated from the site's news content, which you should already have in your database; you just need a PHP script that extracts it and writes the XML.
Edit: no database is used.
OK, now you have just two options:
- Use PHP regular expressions to get the content you need from the HTML page (or maybe phpQuery)
- As you said, write the XML by hand and then upload it. I haven't tried any WYSIWYG XML editors, sorry, but there are many on Google.
Does that PHP site have a database back end? If so, the WYSIWYG editor can post into it, and a dedicated PHP file can then generate an RSS feed.
I've used the following IBM page as a guide and it worked wonderfully:
http://www.ibm.com/developerworks/library/x-phprss/
I decided that instead of trying to find a WYSIWYG editor for XML, I would let the news editor continue to upload the news as HTML. I ended up writing a PHP program to find the <p> and </p> tags and create an XML file out of them.
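A sketch of that approach: pull the <p>…</p> blocks out of the hand-edited HTML news page and wrap them in a minimal RSS 2.0 feed. The function name and the channel title/link below are illustrative, not taken from the actual site.

```php
<?php
// Turn the <p>...</p> paragraphs of an HTML news page into a
// minimal RSS 2.0 feed, one item per paragraph.
function newsToRss($html, $channelTitle, $channelLink) {
    preg_match_all('#<p>(.*?)</p>#si', $html, $matches);

    $xml  = '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
    $xml .= "<rss version=\"2.0\"><channel>\n";
    $xml .= '<title>' . htmlspecialchars($channelTitle) . "</title>\n";
    $xml .= '<link>' . htmlspecialchars($channelLink) . "</link>\n";
    foreach ($matches[1] as $paragraph) {
        // Each paragraph of the news page becomes one feed item;
        // strip any inline markup and escape for XML.
        $xml .= '<item><description>'
              . htmlspecialchars(strip_tags($paragraph))
              . "</description></item>\n";
    }
    return $xml . '</channel></rss>';
}

$html = '<p>Station wins award.</p><p>New <b>morning</b> show starts Monday.</p>';
echo newsToRss($html, 'Radio News', 'http://example.com/news');
```

A real feed would also want per-item titles, links, and pubDates, but this shows the core of the paragraph-to-XML conversion.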
You could use rssa.at: just put in your URL and it'll create an RSS feed for you. You can then let people sign up for alerts (hourly/daily/weekly/monthly) for free, and access stats.
If the HTML is consistent, you could just have them publish as normal and then scrape a feed. There are programmatic ways to do this for sure, but http://www.dapper.net/dapp-factory.jsp is a nice point-and-click feed-scraping service. Then use MagpieRSS, SimplePie, or Feed.informer.com to display the feed.