How do I integrate the Vimeo search database into my website? - php

I have to make a website that pulls videos from YouTube, Vimeo, Dailymotion, etc., and I don't know where to start. I have YouTube's API already integrated, and Vimeo doesn't have a great one. Is there any way I can pull videos from other sites to my site?
Thanks,
Lee

I'm not fond of this kind of stuff, but you could parse the HTML returned from a search on Vimeo: load the HTML into a DOM and run an XPath query for tags whose links look like "/0123456789".
Or ask them whether they have such an API; I'm pretty surprised they don't have one, to be honest...
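For what it's worth, a rough sketch of that scraping idea; the search URL and the assumption that video links are bare numeric paths like "/0123456789" are guesses of mine and will break whenever Vimeo changes its markup:

    <?php
    // Rough sketch: scrape a Vimeo search results page and collect links
    // that look like video pages ("/12345678"). Search URL and markup
    // assumptions are guesses, not a documented API.
    $html = file_get_contents('https://vimeo.com/search?q=' . urlencode('cats'));

    $doc = new DOMDocument();
    libxml_use_internal_errors(true);   // real-world HTML is rarely valid XML
    $doc->loadHTML($html);
    libxml_clear_errors();

    $xpath  = new DOMXPath($doc);
    $videos = array();

    foreach ($xpath->query('//a[@href]') as $a) {
        $href = $a->getAttribute('href');
        if (preg_match('#^/(\d+)$#', $href, $m)) {         // purely numeric path
            $videos[$m[1]] = 'https://vimeo.com' . $href;  // keyed by video id
        }
    }

    print_r($videos);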

Related

How To Read Vimeo RSS Feed Using PHP

I want to read the RSS feed for my videos from Vimeo using a PHP script so I can update a database table with video data; however, even though I can read the feed in my browser, Vimeo returns a 500 error.
I looked at their API documentation, which is about as clear as mud and wants me to create an API. I don't want an API for visitors to use; I just want to read a feed (which would normally be simple using cURL).
I haven't found any help at all in their documentation; hopefully someone can make it a bit clearer for me, please.
Thanks in advance.
I finally managed to resolve this. Feeling rather dumb now: the content of the cURLed RSS feed from Vimeo was returned, but it didn't display on screen when echoed because of some of the content. I needed to view the page source to see it.
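For anyone hitting the same thing, here is roughly what the fetch looks like; the feed URL format is an assumption, and the htmlspecialchars() call is what makes the XML actually show up when echoed to the browser instead of being swallowed as markup:

    <?php
    // Fetch the feed with cURL. The feed URL format is an assumption.
    $feedUrl = 'https://vimeo.com/someusername/videos/rss';

    $ch = curl_init($feedUrl);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);      // return the body instead of printing it
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0'); // some feeds reject requests with no user agent
    $xml = curl_exec($ch);
    curl_close($ch);

    // Escaping the markup is what makes the feed visible in the browser;
    // echoing it raw hides the tags unless you use "view source".
    echo '<pre>' . htmlspecialchars($xml) . '</pre>';

    // From here it can be parsed and pushed into the database table.
    $feed = simplexml_load_string($xml);
    foreach ($feed->channel->item as $item) {
        printf("%s | %s\n", $item->title, $item->link);
    }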

Rendering a pdf/ppt document in a jquery mobile web page

I am developing a jQuery Mobile learning portal, and I would like to render PDF and PPT documents the way SlideShare or Google Books does. I have tried looking around, but I can't seem to find any API that lets me do that. I saw PDF.js, but it seems too heavy for mobile, and besides, it requires a lot of work to style.
Any assistance ... will be appreciated
Have you checked out Scribd's API?

News feed for web page

I'm building a website that will have a news feed (similar in idea to Facebook and Twitter) ...
Other sections in the website will also be using the same idea.
I've already built the DB and am working on the HTML/PHP pages with my team. I need to know how to create a news feed that is scrollable like Facebook's. There's no need to set a limit on it.
UPDATE
After further searching, I found that what I need is a news ticker ...
You should consider Feed Creator.
It'll do everything you want without reinventing the wheel.

How to create a URL extractor like Facebook share

I need to extract data from a URL:
the title, description, and any videos or images at the given URL,
like the Facebook share button does,
like this:
http://www.facebook.com/sharer.php?u=http://www.wired.com&t=Test
regards
Embed.ly has a nice API for exactly this purpose. Their API returns the site's oEmbed data if available; otherwise, it attempts to extract a summary of the page, like Facebook does.
Use something like cURL to get the page and then something like Simple HTML DOM to parse it and extract the elements you want.
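Something along these lines, using PHP's built-in DOM extension in place of Simple HTML DOM; the function name and example URL are placeholders:

    <?php
    // Sketch: fetch a page with cURL, then use the DOM extension to pull
    // out the title, meta description and image URLs.
    function extractPreview($url)
    {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        $html = curl_exec($ch);
        curl_close($ch);

        $doc = new DOMDocument();
        libxml_use_internal_errors(true);   // real-world HTML is rarely valid
        $doc->loadHTML($html);
        libxml_clear_errors();
        $xpath = new DOMXPath($doc);

        $title = $xpath->evaluate('string(//title)');
        $desc  = $xpath->evaluate('string(//meta[@name="description"]/@content)');

        $images = array();
        foreach ($xpath->query('//img[@src]') as $img) {
            $images[] = $img->getAttribute('src');   // may be relative; resolve against $url if needed
        }

        return array('title' => $title, 'description' => $desc, 'images' => $images);
    }

    print_r(extractPreview('http://www.wired.com'));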
If the web site has support for oEmbed, that's easier and more robust than scraping HTML:
oEmbed is a format for allowing an embedded representation of a URL on third party sites. The simple API allows a website to display embedded content (such as photos or videos) when a user posts a link to that resource, without having to parse the resource directly.
oEmbed is supported by sites like YouTube and Flickr.
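As an illustration, a consumer-side sketch against YouTube's oEmbed endpoint; the video URL is only an example, and other providers publish their own endpoints:

    <?php
    // oEmbed flow: ask the provider for metadata about a URL and get JSON back.
    $videoUrl = 'https://www.youtube.com/watch?v=dQw4w9WgXcQ';
    $endpoint = 'https://www.youtube.com/oembed?format=json&url=' . urlencode($videoUrl);

    $data = json_decode(file_get_contents($endpoint), true);

    echo $data['title'] . "\n";           // video title
    echo $data['thumbnail_url'] . "\n";   // preview image
    echo $data['html'] . "\n";            // ready-made embed markup (an <iframe>)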
I am working on a project for this issue; it is not as easy as writing an HTML parser and expecting sites to be 'semantic'. Extracting videos and finding autoplay parameters is especially hard. You can check the project at http://www.embedify.me, which also has an FB-style URL preview script. As I see it, Embed.ly and oEmbed are passive parsers: they need the sites (so-called providers) to support them, so the approach is quite different from what FB does.
While I was looking for a similar functionality, I came across a jQuery + PHP demo of the url extract feature of Facebook messages:
http://www.99points.info/2010/07/facebook-like-extracting-url-data-with-jquery-ajax-php/
Instead of using an HTML DOM parser, it works with simple regular expressions. It looks for the title, description, and img tags. Hence, the image extraction doesn't perform well on many websites that use CSS for their images. Also, Facebook looks first at its own meta tags and then at the classic HTML description tag, but the demo illustrates the principle well.
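A rough illustration of that priority order with plain regular expressions; the patterns assume the property/name attribute comes before content, so treat this as a fragile sketch rather than something robust:

    <?php
    // Prefer Open Graph meta tags, fall back to the classic tags.
    $html = file_get_contents('http://www.wired.com');   // example URL

    function firstMatch(array $patterns, $html)
    {
        foreach ($patterns as $pattern) {
            if (preg_match($pattern, $html, $m)) {
                return trim($m[1]);
            }
        }
        return '';
    }

    $title = firstMatch(array(
        '#<meta[^>]+property=["\']og:title["\'][^>]+content=["\']([^"\']+)["\']#i',
        '#<title[^>]*>(.*?)</title>#si',
    ), $html);

    $description = firstMatch(array(
        '#<meta[^>]+property=["\']og:description["\'][^>]+content=["\']([^"\']+)["\']#i',
        '#<meta[^>]+name=["\']description["\'][^>]+content=["\']([^"\']+)["\']#i',
    ), $html);

    echo $title . "\n" . $description . "\n";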

How to create an RSS feed and display it?

On a website I maintain for a radio station, there is a page that displays news articles. Right now the news is posted in an HTML page which is then read by a PHP page that includes all the navigation. I have been asked to turn this into an RSS feed. How do I do this? I know how to make the XML file, but the person who edits the news file is not technical and needs a WYSIWYG editor. Is there a WYSIWYG editor for XML? Once I have the feed, how do I display it on my site? I'm working with PHP on this site, so a PHP solution would be preferred.
Use Yahoo Pipes!: you don't need programming knowledge, and the load on your site will be lower. Once you've got your feed, display it on your site using a simple anchor with an image in HTML. You could consider piping your feed through FeedBurner too.
And for the freebie: if you want to track your feed awareness data in RSS, use my service here.
Do you mean that someone will insert the feed content by hand?
Usually feeds are generated from the site's news content, which you should already have in your database; you just need a PHP script that extracts it and writes the XML.
Edit: no database is used.
OK, so now you have just two ways:
Use PHP regular expressions to get the content you need from the HTML page (or maybe phpQuery)
As you said, write the XML by hand and then upload it; I haven't tried any WYSIWYG XML editor, sorry, but there are many on Google.
Does that PHP site have a database back end? If so, the WYSIWYG editor posts into it, and then a dedicated PHP file generates an RSS feed.
I've used the following IBM page as a guide and it worked wonderfully:
http://www.ibm.com/developerworks/library/x-phprss/
I decided that instead of trying to find a WYSIWYG editor for XML, I would let the news editor continue to upload the news as HTML. I ended up writing a PHP program that finds the <p> and </p> tags and creates an XML file out of them.
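Roughly what that program does; the file names and channel details below are placeholders:

    <?php
    // Pull the paragraphs out of the uploaded HTML news page and turn each
    // one into an RSS <item>.
    $html = file_get_contents('news.html');
    preg_match_all('#<p>(.*?)</p>#si', $html, $matches);

    $rss     = new SimpleXMLElement('<rss version="2.0"><channel/></rss>');
    $channel = $rss->channel;
    $channel->addChild('title', 'Station News');
    $channel->addChild('link', 'http://example.com/news');
    $channel->addChild('description', 'Latest news from the station');

    foreach ($matches[1] as $i => $paragraph) {
        $item = $channel->addChild('item');
        $item->addChild('title', 'News item ' . ($i + 1));
        // addChild does not escape ampersands, so escape the text first
        $item->addChild('description', htmlspecialchars(strip_tags($paragraph)));
        $item->addChild('link', 'http://example.com/news#item' . ($i + 1));
    }

    $rss->asXML('feed.xml');   // write the feed where the site can serve it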
You could use rssa.at: just put in your URL and it'll create an RSS feed for you. You can then let people sign up for alerts (hourly/daily/weekly/monthly) for free, and access stats.
If the HTML is consistent, you could just have them publish as normal and then scrape a feed. There are programmatic ways to do this, for sure, but http://www.dapper.net/dapp-factory.jsp is a nice point-and-click feed-scraping service. Then use MagpieRSS, SimplePie, or Feed.informer.com to display the feed.
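For the display side, a minimal SimplePie sketch; the feed URL and cache directory are placeholders:

    <?php
    // Fetch, cache and print the first ten items of a feed with SimplePie.
    require_once 'simplepie/autoloader.php';   // or simplepie.inc in older releases

    $feed = new SimplePie();
    $feed->set_feed_url('http://example.com/feed.xml');
    $feed->set_cache_location('./cache');      // must be writable by the web server
    $feed->init();
    $feed->handle_content_type();

    foreach ($feed->get_items(0, 10) as $item) {
        printf(
            '<h3><a href="%s">%s</a></h3><p>%s</p>',
            $item->get_permalink(),
            $item->get_title(),
            $item->get_description()
        );
    }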
