How do I extract the blog posts like this site did? - php

This site is built on Ning. You'll notice they have jQuery tabs set up on the home page and, looking through the source code, you'll see that those tabs are getting their content from an outside URL (below):
<div class="ui-tabs" id="tabs">
<ul>
<li>Features</li>
<li>Vip Blogs</li>
<li>All</li>
</ul>
</div>
However, those URLs aren't standard to Ning (I've tried appending /vip/blog/embedPromoted?pageSize=10 to a similar Ning blog URL and it didn't work), which leads me to believe they were created separately somehow to extract just the blog posts. Here is what a blog page on Ning looks like for reference: link
Anybody have an idea of how they created these pages with just the blog posts? I originally thought of using the blog's RSS feed, but the RSS doesn't include author avatars or certain post metadata the way the first link I posted above does.
Any help would be hugely appreciated.
Thanks to all in advance!

The /vip URLs on that Ning site are a custom feature from when Ning used to host custom PHP code. Since it's a custom feature, it's not available on other sites.
It's possible to create something similar using the Ning API to aggregate blog content from a specific set of members into a single HTML page or RSS feed. It would have to be implemented on an external server.
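A rough sketch of that approach in PHP, run on your own server. The endpoint URL, query parameters, and JSON field names below are assumptions (and the real Ning API requires OAuth signing), so treat this as an outline rather than working code:
<?php
// Sketch: aggregate blog posts from a set of members into one HTML fragment.
// The endpoint, parameters, and field names are placeholders -- check the Ning
// API documentation and add its OAuth signing before using anything like this.
$members = array('alice', 'bob');
$items = array();
foreach ($members as $member) {
    $url = 'https://external.ningapis.com/xn/rest/yournetwork/1.0/BlogPost?author=' . urlencode($member);
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $json = curl_exec($ch);
    curl_close($ch);
    $data = json_decode($json, true);
    if (!empty($data['entry'])) {
        foreach ($data['entry'] as $post) {
            $items[] = '<li>' . htmlspecialchars($post['title']) . '</li>';
        }
    }
}
echo '<ul>' . implode("\n", $items) . '</ul>';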

Check out rssinclude.com, handy way to drop RSS feeds into a site.
If that won't work, you can use the QueryPath library to grab the HTML from the site jQuery style, but in PHP.
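For the QueryPath route, a minimal sketch (this assumes QueryPath 3 installed via Composer; the URL and the CSS selectors are placeholders you'd swap for whatever the target page's markup actually uses):
<?php
require 'vendor/autoload.php'; // querypath/querypath via Composer
// Sketch only: scrape post titles and links from the remote page with CSS selectors.
$url = 'http://example-ning-site.com/vip/blog/embedPromoted?pageSize=10';
foreach (htmlqp($url, 'h3.entry-title a') as $link) {
    echo htmlspecialchars($link->text()) . ' -> ' . htmlspecialchars($link->attr('href')) . "<br>\n";
}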

What do you mean by the RSS feeds not including the author's avatar? It is included in the link you have in your OP.
One way this could have been implemented is by applying an XSLT template to an RSS feed to build the HTML page.
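If you wanted to reproduce that yourself, PHP's XSL extension can apply a stylesheet to the feed. A minimal sketch (the feed URL is a placeholder and blog.xsl is a stylesheet you would write):
<?php
// Sketch: transform an RSS feed into an HTML fragment with an XSLT stylesheet.
// Requires the php-xsl extension.
$rss = new DOMDocument();
$rss->load('http://example-ning-site.com/profiles/blog/feed');

$xsl = new DOMDocument();
$xsl->load('blog.xsl');

$proc = new XSLTProcessor();
$proc->importStylesheet($xsl);
echo $proc->transformToXml($rss);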

Related

What's the best way to change the content URLs in the WordPress REST API results?

Say I have api.wordpresssite.com where I will be entering data, uploading images and so on. Then say that I want to consume the WordPress API on another site like mysite.com.
WordPress will then assume that every link in the content points at api.wordpresssite.com. It will also embed images with the same URL, because the links and media are absolute.
Am I supposed to process the content on mysite.com looking for links but ignoring media or is there a plugin or function that I can add to my theme to do this?
I have tried changing the base URL and while it works for links, it breaks media uploads.
There are multiple endpoints, so I am thinking that even if there were a function to add, it would be too far down the line of execution to do anything.
For example, there is the WP API, as well as JetPack and Yoast, that I am using.
There are tonnes of articles on "How to use WordPress in Laravel", but not a single one talks about how to "normalize" the content for the site that is consuming it.
The API feels like its only true out-of-the-box use is with some kind of JavaScript-based front end.
What I would like is a headless WordPress API with relative URLs for content links.
I was looking for the same answer and solved it this way:
What you would need to do is set the site URL to www.remote-domain.com.
You can do this by going to WP Admin > Settings > General
Then, in your functions.php file or anywhere else you can add a filter (a plugin, etc.), add this filter in:
add_filter( 'rest_url', 'fix_rest_url' );
function fix_rest_url( $url ) {
    // Swap the front-end (home) URL for the WordPress install (site) URL so REST
    // requests go to the API domain -- this follows the workaround in the trac
    // ticket linked below.
    return str_replace( home_url(), site_url(), $url );
}
Got the answer from: https://core.trac.wordpress.org/ticket/49146
My suspicion that no one really uses the WordPress REST API from an external site seems to be confirmed.
I had to create some render methods that look for href links and remap them in the content and in the menus.
It still seems silly and not very polished.
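A simplified sketch of that kind of remapping (the domain names come from the question; the decision to leave media URLs on the API host is an assumption, so adjust to taste):
<?php
// Sketch: on the consuming site, rewrite absolute links in REST API content from
// the API domain to this site, while leaving media (uploads) URLs alone.
function remap_content_links($html) {
    $api  = 'https://api.wordpresssite.com';
    $site = 'https://mysite.com';
    return preg_replace_callback('/href="([^"]+)"/i', function ($m) use ($api, $site) {
        $href = $m[1];
        // Skip uploads so images and attachments keep resolving on the API host.
        if (strpos($href, $api . '/wp-content/uploads/') === 0) {
            return 'href="' . $href . '"';
        }
        return 'href="' . str_replace($api, $site, $href) . '"';
    }, $html);
}

// Usage:
// $post = json_decode(file_get_contents('https://api.wordpresssite.com/wp-json/wp/v2/posts/123'), true);
// echo remap_content_links($post['content']['rendered']);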

Is it better to put a social media sharer in entry-footer.php or in comments.php in a WordPress blog?

I'm deciding where I should put the social media sharers. Currently I put the sharer code in entry-footer.php. But is it more semantic to put the sharers in entry-footer.php, at the bottom of every post, or in comments.php, together with the comment box, in WordPress?
Also, since entry-footer.php is by default only part of the single-post structure, is it reasonable to have page.php load entry-footer.php too? Or should I just repeat the block of sharer code inside page.php? How about loading comments.php with the sharer code inside?
I use, and suggest you keep, the sharer in entry-footer.php, since people are sharing your current single entry on social media, not its comments.
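As for reusing it on pages, one way (a sketch; the template-part name is arbitrary) is to move the sharer markup into its own template part and pull it in from both entry-footer.php and page.php instead of duplicating the code:
<?php
// template-parts/sharing.php -- the sharer markup lives in one place.
?>
<div class="social-sharing">
    <a href="https://twitter.com/intent/tweet?url=<?php echo urlencode( get_permalink() ); ?>">Share on Twitter</a>
    <a href="https://www.facebook.com/sharer/sharer.php?u=<?php echo urlencode( get_permalink() ); ?>">Share on Facebook</a>
</div>

<?php
// In entry-footer.php and in page.php (inside the loop), include it with:
get_template_part( 'template-parts/sharing' );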

Download post as text file on wordpress blog post page

Is there any way to download individual blog posts as text (or HTML) files? I am trying to generate the markup for an HTML email based on the content of the blog post.
The content of the files would be the excerpt.
What would be the easiest way to generate these files? I am not sure how to approach this problem.
I've used WordPress to generate email newsletter templates on a dozen or so of the sites I've built.
Just create a new page template (see the associated Page Templates Codex page) designed with your markup for the HTML newsletter (usually very simple, since most email clients have unbelievably shitty HTML/CSS support that's a solid decade behind the rest of the internet).
Then use template functions and a custom loop to get the blog post or other content you want from WordPress, display it in the template, and when you view the page you can just copy the HTML. Note that you probably want to write your own header and footer for this template to keep it slim (rather than using get_header() and get_footer()).
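A stripped-down sketch of such a page template (the template name and the query arguments are placeholders; it deliberately prints bare HTML you can copy straight into an email):
<?php
/* Template Name: Email Newsletter */
// Sketch: render the latest posts as minimal, copy-and-paste-able HTML.
$newsletter = new WP_Query( array( 'posts_per_page' => 5 ) );
?>
<!DOCTYPE html>
<html>
<body style="font-family: Arial, sans-serif;">
<?php while ( $newsletter->have_posts() ) : $newsletter->the_post(); ?>
    <h2 style="margin:0 0 4px;"><a href="<?php the_permalink(); ?>"><?php the_title(); ?></a></h2>
    <p><?php the_excerpt(); ?></p>
<?php endwhile; wp_reset_postdata(); ?>
</body>
</html>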
Take the post body and do a strip_tags().
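If you literally want a .txt file per post, a small endpoint along these lines would do it (a sketch, meant to run inside a template or handler where the post is known):
<?php
// Sketch: serve the current post's body as a plain-text download.
$post_id = get_the_ID(); // or pass a specific post ID in
$text    = strip_tags( get_post_field( 'post_content', $post_id ) );
header( 'Content-Type: text/plain; charset=utf-8' );
header( 'Content-Disposition: attachment; filename="post-' . $post_id . '.txt"' );
echo $text;
exit;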

Posting full articles from RSS feeds

How can this be done?
Right now I use fulltextrssfeed.com and WP Robot, but I don't really like it, because fulltextrssfeed.com only posts about 5 articles when there were more than 50 in the original RSS feed. And none of the plugins I tested (FeedWordPress, RSS Poster, WP-o-Matic, WP Robot) could post full articles from every feed I used.
The feeds themselves don't contain the full text of the articles.
What fulltextrssfeed.com does is fetch the URL of each feed entry and extract the full text from there. This is what you have to do too, if you want all the articles.
You can use http://www.feedsapi.com/ or http://fivefilters.org/content-only/ for that.
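The do-it-yourself version of that in PHP looks roughly like this; the XPath expression that picks out the article body is a placeholder and has to be tuned for each site:
<?php
// Sketch: read the feed, fetch each entry's page, and pull the article body out of the HTML.
$feed = simplexml_load_file('http://example.com/feed/');
foreach ($feed->channel->item as $item) {
    $html = file_get_contents((string) $item->link);

    $doc = new DOMDocument();
    libxml_use_internal_errors(true); // real-world HTML is rarely valid
    $doc->loadHTML($html);
    libxml_clear_errors();

    $xpath = new DOMXPath($doc);
    $nodes = $xpath->query('//article | //div[contains(@class, "entry-content")]');
    if ($nodes->length) {
        $fullText = $doc->saveHTML($nodes->item(0));
        // ...hand $fullText to your importer / wp_insert_post() call here.
    }
}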
You can use the API provided by Full Post RSS Feed, which extracts 10 posts from the RSS feed; it is basically a hosted version of the FiveFilters premium API. FiveFilters regularly updates its templates to extract content from the major themes and sites.
If it is not able to extract the full content from a site, you can add a template for that site. A template is something that tells the extractor which tags and classes contain the content; for example, in WordPress the content usually starts inside an article tag or a div with a particular class. If you know a programming language, you can also write your own RSS parser and full-content extractor using cURL or any other method. It is fairly simple; you just need some time to understand the concept.

Facebook Feed (RSS using PHP)

Can someone tell me what happens when I enter a link into the Facebook status update form and it loads up a mini info card for the website? (I'm guessing it's RSS or something?)
How do I implement this on my site using PHP?
What do I need to learn to be able to implement that?
It scrapes the page you are linking to. It doesn't have anything to do with RSS.
By looking at the HTML of the page it can get the page title for you and find all the images that can be used as a thumbnail.
Take a look at HTTP or cURL in the PHP manual for methods to get webpage content.
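A bare-bones version of that scrape in PHP: it checks Open Graph tags first (most sites publish them) and falls back to the <title> tag and the first image. Error handling is omitted and the fallback selectors are deliberately simple.
<?php
// Sketch: fetch a URL and pull out a title and a thumbnail, roughly the way a link preview does.
function fetch_preview($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $html = curl_exec($ch);
    curl_close($ch);

    $doc = new DOMDocument();
    libxml_use_internal_errors(true);
    $doc->loadHTML($html);
    libxml_clear_errors();
    $xpath = new DOMXPath($doc);

    // Prefer Open Graph metadata, fall back to <title> and the first <img>.
    $title = $xpath->evaluate('string(//meta[@property="og:title"]/@content)')
          ?: $xpath->evaluate('string(//title)');
    $image = $xpath->evaluate('string(//meta[@property="og:image"]/@content)')
          ?: $xpath->evaluate('string(//img/@src)');

    return array('title' => $title, 'image' => $image);
}

print_r(fetch_preview('https://example.com/some-article'));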
