I want to get data from XML API output from Vimeo.
On Vimeo, if we load this URL: http://vimeo.com/api/v2/video/30055721.xml (the video ID is 30055721), it outputs the XML data in the browser as a single-line chunk:
<?xml version="1.0" encoding="UTF-8"?><videos><video><id>30055721</id><title>[MV]I-ny(아이니) 뮤직비디오</title><description>눈부신 가을 하늘을 닮은 목소리의 주인공 '아이니(i-ny)', <br /> 그녀의 이름을 노래하다.</description><url>http://vimeo.com/30055721</url><upload_date>2011-10-04 22:34:19</upload_date><mobile_url>http://vimeo.com/m/30055721</mobile_url><thumbnail_small>http://b.vimeocdn.com/ts/201/671/201671639_100.jpg</thumbnail_small><thumbnail_medium>http://b.vimeocdn.com/ts/201/671/201671639_200.jpg</thumbnail_medium><thumbnail_large>http://b.vimeocdn.com/ts/201/671/201671639_640.jpg</thumbnail_large><user_id>2991448</user_id><user_name>Deviljoon</user_name><user_url>http://vimeo.com/user2991448</user_url><user_portrait_small>http://b.vimeocdn.com/ps/217/387/2173872_30.jpg</user_portrait_small><user_portrait_medium>http://b.vimeocdn.com/ps/217/387/2173872_75.jpg</user_portrait_medium><user_portrait_large>http://b.vimeocdn.com/ps/217/387/2173872_100.jpg</user_portrait_large><user_portrait_huge>http://b.vimeocdn.com/ps/217/387/2173872_300.jpg</user_portrait_huge><stats_number_of_likes>3</stats_number_of_likes><stats_number_of_plays>542</stats_number_of_plays><stats_number_of_comments>0</stats_number_of_comments><duration>235</duration><width>1280</width><height>720</height><tags>I-ny, 아이니, 뮤직비디오, music video, MV, kpop, k-pop, 550d</tags><embed_privacy>anywhere</embed_privacy></video></videos>
But I want to retrieve the data from the XML fields dynamically and show it on my webpage.
Check out this article for a complete run through:
http://ditio.net/2008/06/19/using-php-curl-to-read-rss-feed-xml/
This should give you a good idea of how to fetch the XML into your PHP script and then parse its contents. You will need to adapt the feed-parsing step to the Vimeo output specifically, but you should be able to do this simply by having a play.
e.g. the below will output the ID.
$ch = curl_init("http://vimeo.com/api/v2/video/30055728.xml");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_HEADER, 0);
$data = curl_exec($ch);
curl_close($ch);
$xml= new SimpleXmlElement($data, LIBXML_NOCDATA);
echo "<strong>".$xml->video->id."</strong>";
Once $xml has been established, simply change $xml->video->id to whichever node you want (crucially, the 'id' part).
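For example, the same object exposes the other fields shown in the response above; a minimal sketch pulling a few of them:
// $xml is the SimpleXmlElement built above from the Vimeo response
echo $xml->video->title;             // the video title
echo $xml->video->thumbnail_medium;  // URL of the 200px thumbnail
echo $xml->video->duration;          // length in seconds (235 in the sample above)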
Related
I am working on a project where I must pull data from an XML API using PHP cURL.
I am unable to retrieve very large data sets from the API. For example, I have no issue pulling 500 user email addresses, but if I pull a list of 15,000 email addresses I only get about 4,000 back and the request takes about 3 minutes.
I have adjusted the timeout in php.ini which did not help.
My thought is to retrieve the data in batches and then append to the data pulled in. I am pretty new to PHP/back end scripting so forgive me if this seems like a simple question.
My research suggests array_chunk may be useful, but wouldn't that need the full data set to chunk anyway? Or could I pull the data in as a chunk, go back to where I left off in the data set, and rinse/repeat?
Here is the code I have so far.
function load_users($username, $token, $path, $list_id)
{
    // Build the XML request body
    $xml = '<xmlrequest>
        <username>'.$username.'</username>
        <usertoken>'.$token.'</usertoken>
        <requesttype>subscribers</requesttype>
        <requestmethod>GetSubscribers</requestmethod>
        <details>
            <searchinfo>
                <List>
                    '.$list_id.'
                </List>
                <Email></Email>
            </searchinfo>
        </details>
    </xmlrequest>';

    // POST the request to the API endpoint
    $ch = curl_init($path);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_POST, 1);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $xml);
    $result = curl_exec($ch);
    curl_close($ch);

    // Parse the XML response
    $xml_doc = simplexml_load_string($result);
    return $xml_doc;
}
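Since the question asks about pulling the data down in batches, one option (if the API supports paging at all) is to request the subscribers page by page and merge the results as you go, instead of trying to array_chunk a list you never fully received. The sketch below is an assumption-heavy outline: the start/per-page style paging and the get_subscriber_batch() helper are hypothetical, so the names need to be swapped for whatever the API actually accepts.
// Hypothetical batched retrieval; the paging parameters and helper are assumptions.
function load_all_users($username, $token, $path, $list_id, $batch_size = 500)
{
    $all_emails = array();
    $start = 0;

    do {
        // get_subscriber_batch() would build the same request as load_users(),
        // plus the API's paging fields, and return the parsed SimpleXML response.
        $xml_doc = get_subscriber_batch($username, $token, $path, $list_id, $start, $batch_size);

        $count = 0;
        // The element name below is a guess; inspect the real response to get the right path.
        foreach ($xml_doc->xpath('//emailaddress') as $email) {
            $all_emails[] = (string) $email;
            $count++;
        }

        $start += $batch_size;            // move on to the next batch
    } while ($count === $batch_size);     // a short batch means we reached the end

    return $all_emails;
}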
I'm trying to figure out how to take information from another site (with a different domain name) and use it in my PHP program.
Explanation:
User inputs URL from another site.
jQuery or PHP takes the information from the entered URL. I know where the information is (I know its div's ID).
And that value is put into my PHP program as a variable, $kaina for example.
EX:
User enters URL:http://www.sportsdirect.com/lee-cooper-bud-mens-boots-118358
And I want to get the Price. (27,99)
What language should I use? PHP, jQuery, or something else?
What function should I use?
What should the program look like?
Thank you for your answers :)
I'd say you have to use PHP (cURL or file_get_contents) to download the page onto your server, then parse it or use regular expressions to get the price. In this case it will be even trickier, because it looks like this link leads to a page that uses JavaScript.
But you have to know the format of the data you are going to extract, so PHP will do the job.
PHP's cURL library should do the trick for you: http://php.net/manual/en/book.curl.php
<?php
// Fetch the page and write the response straight to a local file
$ch = curl_init("http://www.example.com/");
$fp = fopen("example_homepage.txt", "w");
curl_setopt($ch, CURLOPT_FILE, $fp);  // write the response body to $fp
curl_setopt($ch, CURLOPT_HEADER, 0);  // leave the HTTP headers out
curl_exec($ch);
curl_close($ch);
fclose($fp);
?>
You need to research each of the steps mentioned below.
One thing you can do is post the message entered by the user to the server (i.e. a PHP file), where you can extract the URL the user entered.
To extract the URL from the user's post, you can use a regex search.
Check this link out:
Extract URLs from text in PHP
Now you can cURL to the URL extracted from the user input.
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);   // return the page as a string
curl_setopt($ch, CURLOPT_URL, $extracted_url); // the URL pulled from the user's input
$html = curl_exec($ch);
curl_close($ch);
The cURL output will contain the complete HTML of the page; you can then use an HTML parser
$DOM = new DOMDocument;
$DOM->loadHTML($html);
and walk the document until the required div is found, to read its value.
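To make that concrete, here is a minimal sketch using DOMDocument with DOMXPath to read the text of a div by its ID. The ID "price" is only a placeholder; substitute whatever ID you find when you inspect the page.
// Suppress the warnings real-world HTML usually triggers, then parse it
$DOM = new DOMDocument;
@$DOM->loadHTML($html);

// Look the element up by its id attribute ("price" is a placeholder)
$xpath = new DOMXPath($DOM);
$nodes = $xpath->query('//div[@id="price"]');

if ($nodes->length > 0) {
    $kaina = trim($nodes->item(0)->textContent); // e.g. "27,99"
    echo $kaina;
}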
I would probably do something like this:
get the contents of the file: $contents = file_get_contents("http://www.sportsdirect.com/lee-cooper-bud-mens-boots-118358")
convert the contents you just got to xml: $xml = new SimpleXMLElement($contents);
search the XML for the node with attribute itemprop="price" using an XPath query
read the contents of that node, et voila, you have your price (sketched below)
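A rough sketch of those four steps is below. Keep in mind it is a best-case sketch: real product pages are rarely well-formed XML, so new SimpleXMLElement($contents) may simply throw on the HTML, in which case the DOMDocument approach above is the safer route.
// 1. Get the contents of the page
$contents = file_get_contents("http://www.sportsdirect.com/lee-cooper-bud-mens-boots-118358");

// 2. Convert the contents to XML (only works if the markup is well-formed)
$xml = new SimpleXMLElement($contents);

// 3. Search for the node with attribute itemprop="price" using an XPath query
$nodes = $xml->xpath('//*[@itemprop="price"]');

// 4. Read the contents of that node
if (!empty($nodes)) {
    echo (string) $nodes[0]; // e.g. "27,99"
}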
<?php
$url='http://bart.gov/dev/eta/bart_eta.xml';
$c = curl_init($url);
curl_setopt($c, CURLOPT_MUTE, 1);
curl_setopt($c, CURLOPT_RETURNTRANSFER, 1);
$rawXML = curl_exec($c);
curl_close($c);
$fixedupXML = htmlspecialchars($rawXML);
foreach($fixedupXML->eta as $eta) {
echo $eta->destination;
}
?>
As a way to get introduced to PHP, I've decided to parse BART's XML feed and display it on my webpage. I managed (also via this site) to fetch the data and preserve the XML tags. However, when I try to output the XML data, using what I found to be the simplest method, nothing happens.
foreach($fixedupXML->eta as $eta){
echo $eta->destination;
}
Am I not getting the nested elements right in the foreach loop?
Here is the BART XML feed http://www.bart.gov/dev/eta/bart_eta.xml
Thanks!
You may want to look at simplexml, which is a fantastic and really simple way to work with XML.
Here's a great example:
$xml = simplexml_load_file('http://bart.gov/dev/eta/bart_eta.xml');
Then you can run a print_r on $xml to see its contents:
print_r($xml);
And you should be able to work with it from there :)
If you still need to use curl to get the feed data for some reason, you can feed the XML into simplexml like this:
$xml = simplexml_load_string($rawXML);
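Once the feed is loaded, a plain foreach is enough to walk it. As a rough sketch, assuming the feed really does expose eta elements with destination children the way the question's loop expects (run print_r($xml) first to confirm the actual nesting, e.g. whether the etas sit under a station element):
$xml = simplexml_load_file('http://bart.gov/dev/eta/bart_eta.xml');

// The nesting below is an assumption; adjust it to whatever print_r($xml) shows
foreach ($xml->station as $station) {
    foreach ($station->eta as $eta) {
        echo $eta->destination . "<br />";
    }
}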
I am new to PHP, and I am using it to POST data from an iPhone. I have written basic code to get data from the iPhone, post it to the PHP script on my server, and then, using PHP, send that data to another webserver. However, the webserver returns its response in XML, and since I am a newbie to PHP, I need help with it.
My code to send data:
<?php
$ch = curl_init("http://api.online-convert.com/queue-insert");
$count = $_POST['count'];
$request["queue"] = file_get_contents($count);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_POSTFIELDS, $request);
$response = curl_exec($ch);
curl_close ($ch);
echo $response;
?>
I know I need to parse the XML response, but I have no idea how to do that. The XML response would be something like this:
<?xml version="1.0" encoding="utf-8"?>
<queue-answer>
<status>
<code>0</code>
<message>Successfully inserted job into queue.</message>
</status>
<params>
<downloadUrl>http://www.online-convert.com/result/07d6c1491bb5929acd71c531122d2906</downloadUrl>
<hash>07d6c1491bb5929acd71c531122d2906</hash>
</params>
</queue-answer>
You're probably looking for SimpleXML or DOMDocument.
SimpleXML
To load data from a string into an object: simplexml_load_string().
DOMDocument
To create one from a string: DOMDocument->loadXML().
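As a quick sketch of the SimpleXML route against the exact response shown above, this pulls out the status code, the message, and the download URL:
$xml = simplexml_load_string($response); // $response is the string returned by curl_exec() above

$code        = (string) $xml->status->code;        // "0"
$message     = (string) $xml->status->message;     // "Successfully inserted job into queue."
$downloadUrl = (string) $xml->params->downloadUrl; // where the converted file will be available
$hash        = (string) $xml->params->hash;

if ($code === '0') {
    echo "Job queued, result at: " . $downloadUrl;
}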
Read about SimpleXML: http://php.net/manual/en/book.simplexml.php
Google is your friend!
The built-in PHP XML Parser is your best bet.
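If you do go with the built-in (expat-based) XML Parser instead, a minimal sketch against the same response would look like this; note that by default it folds tag names to upper case:
$parser = xml_parser_create();
xml_parse_into_struct($parser, $response, $values, $index);
xml_parser_free($parser);

// $index maps (upper-cased) tag names to positions in $values
$downloadUrl = $values[$index['DOWNLOADURL'][0]]['value'];
echo $downloadUrl;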
I'm using jQuery to set up an Ajax request that grabs an XML feed from a PHP script and then parses some information out of the feed and inserts it into the DOM. It works fine in Firefox; however, in Chrome, I am getting an empty string for the title element.
Here's the basic setup of the Ajax request:
$.get('feed.php', function(oXmlDoc) {
$(oXmlDoc).find('entry').each(function() {
$(this).find('title').text();
$(this).find('id').text();
// do other work...
});
});
For what it's worth, here's the PHP script that's grabbing data from the feed. I'm using cURL because I'm making the request across domains (and because it was a quick and dirty solution for the problem at hand).
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, $str_feed_url);
curl_setopt($curl, CURLOPT_HEADER, false);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
$xml = curl_exec($curl);
curl_close($curl);
echo $xml;
The XML data is being returned correctly and I can see the values of the sibling nodes in Chrome (such as ID), but, for whatever reason, I continue to get an empty string for the title node.
Edit: As requested, here's a fragment of the relevant XML:
<entry>
<id>http://TheAddress.com/feed/01</id>
<title type="text">The Title of the Post</title>
<author><name>Tom</name></author>
<published>2009-11-05T13:46:44Z</published>
<updated>2009-11-05T14:02:19Z</updated>
<summary type="html">...</summary>
</entry>
The page referenced in the example XML has an HTML entity in its title. This can cause issues. Here is a blog post I found on this issue.
I wonder if the same goes for other special characters...
The home page title looks like this:
<title>The Address Hotels + Resorts</title>
I haven't tried it, but make sure that the XML is returned with the correct content type (text/xml). You can also set the dataType to "xml" in the jQuery.ajax(options).
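On the PHP side that just means sending the header before echoing the feed, something along these lines (the dataType change would then go in the jQuery call on the client side):
// Tell the browser the response really is XML before echoing it
header('Content-Type: text/xml; charset=utf-8');
echo $xml; // the string returned by curl_exec() in the script above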
I have the same problem.
It seems that Chrome doesn't handle the html title tag well in Ajax responses.
Try changing "title" to "booktitle" in the XML and in the JS.