Use 'foreach' in script - php

I'm trying to write a script that lists all the image URLs from a specific URL. I used foreach in order to scan several pages, but I think it's not working well.
This is my code:
<?php
include('simple_html_dom.php');

$array = array($page, $page2);
$page = "https://www.dllo.dev";
$page2 = "https://www.dllo2.dev";

$html = new simple_html_dom();
$html->load_file($array);

$images = array();
foreach ($html->find('img') as $element) {
    $images[] = $element->src;
}
reset($images);

echo "URL $array:<br /><br />";
foreach ($images as $out) {
    $url = "$base$out";
    echo "$url,&nbsp";
}
It's partially working, but only with the first URL ($page)... Any idea?
:D
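A sketch of one possible fix (variable names here are just for illustration): define the URLs before building the array, and load each page separately, since load_file() loads one page, not an array of them:

<?php
include('simple_html_dom.php');

// Define the URLs first, then collect them
$page  = "https://www.dllo.dev";
$page2 = "https://www.dllo2.dev";
$pages = array($page, $page2);

$images = array();
foreach ($pages as $base) {
    $html = new simple_html_dom();
    $html->load_file($base);                 // one page per iteration
    foreach ($html->find('img') as $element) {
        // prefix with the page it came from, as in the original (assumes relative src paths)
        $images[] = $base . $element->src;
    }
    $html->clear();
}

foreach ($images as $url) {
    echo "$url,&nbsp;";
}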

Related

I'm trying to download images but I get only one image

I'm trying to download images from this URL = https://mangaarabteam.com/manga/yuan-zun/94/, but I get only one image.
This code is in manga.php:
include('simple_html_dom.php');
$url = 'https://mangaarabteam.com/manga/yuan-zun/94/';
$html = file_get_html($url);
foreach ($html->find('img') as $e) {
    $image_links = $e->src;
    $images_url = array();
    array_push($images_url, $image_links);
}
This code is in index.php:
foreach ($images_url as $image) {
    print_r($image);
}
I want $image to output multiple images, not only one image.
Try the code below.
You're initializing your empty array ($images_url = array();) inside the foreach; it should be before the foreach.
$url = 'https://mangaarabteam.com/manga/yuan-zun/94/';
$html = file_get_html($url);

$images_url = array();
foreach ($html->find('img') as $e) {
    $image_links = $e->src;
    array_push($images_url, $image_links);
}

foreach ($images_url as $image) {
    print_r($image);
}

PHPQuery - get all links containing a specific URL on a page

I am trying to get all links containing a specific URL on a given page using PHPQuery. I am using the PHP support syntax of PHPQuery.
include_once 'phpQuery.php';
$url = 'http://www.phonearena.com/phones/manufacturer/';
$doc = phpQuery::newDocumentFile($url);
$urls = $doc['a'];
foreach ($urls as $url) {
    echo pq($url)->attr('href') . '<br>';
}
The code above works, but it shows all the links.
I want to show only those containing "/phones/manufacturer/".
I tried this, but it shows nothing:
include_once 'phpQuery.php';
$url = 'http://www.phonearena.com/phones/manufacturer/';
$doc = phpQuery::newDocumentFile($url);
$urls = $doc['a'];
foreach ($urls as $url) {
    echo pq($url)->attr('href:contains("/phones/manufacturer/")') . '<br>';
}
Use the code below to get all URLs from that site:
$doc = new DOMDocument();
@$doc->loadHTML(file_get_contents('http://www.phonearena.com/phones/manufacturer/'));
$ahreftags = $doc->getElementsByTagName('a');
foreach ($ahreftags as $tag) {
    echo "<br/>";
    echo $tag->getAttribute('href');
    echo "<br/>";
}
exit;
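That lists every link on the page, though. To keep only the ones containing "/phones/manufacturer/", a simple strpos() check inside the same DOMDocument loop should do, for example:

$doc = new DOMDocument();
@$doc->loadHTML(file_get_contents('http://www.phonearena.com/phones/manufacturer/'));
foreach ($doc->getElementsByTagName('a') as $tag) {
    $href = $tag->getAttribute('href');
    // keep only the links that contain the wanted substring
    if (strpos($href, '/phones/manufacturer/') !== false) {
        echo $href . '<br/>';
    }
}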
Try this; see also a little Italian guide and the jQuery selector documentation:
include_once 'phpQuery.php';
$url = 'http://www.phonearena.com/phones/manufacturer/';
$doc = phpQuery::newDocumentFile($url);
$urls = $doc['a[href*="/phones/manufacturer/"]'];
foreach ($urls as $url) {
    echo pq($url)->attr('href') . '<br>';
}

Fetch image URLs from XML

I am using the code below to fetch $movie->id from the response XML:
<?php
$movie_name = 'Dabangg 2';
$url = 'http://api.themoviedb.org/2.1/Movie.search/en/xml/accd3ddbbae37c0315fb5c8e19b815a5/%22Dabangg%202%22';
$xml = simplexml_load_file($url);
$movies = $xml->movies->movie;
foreach ($movies as $movie) {
    $arrMovie_id = $movie->id;
}
?>
Given the response XML structure, how can I fetch the image URL with thumb size?
See below for an easy way to get only specific images.
$xml = simplexml_load_file($url);
$images = $xml->xpath("//image");
//echo "<pre>"; print_r($images); die;
foreach ($images as $image) {
    if ($image['size'] == "thumb") {
        echo "URL: " . $image['url'] . "<br/>";
        echo "SIZE: " . $image['size'] . "<br/>";
        echo "<hr/>";
    }
}
Use the attributes() method of SimpleXmlElement.
Example:
$imageAttributes = $movie->images[0]->attributes();
$size = $imageAttributes['size'];
See the documentation at: http://www.php.net/manual/en/simplexmlelement.attributes.php
EDIT: select only URL attributes with size = "thumb" and type = "poster":
$urls = $xml->xpath("//image[@size='thumb' and @type='poster']/@url");
If you expect only one URL, do:
$url = (string)$xml->xpath("//image[@size='thumb' and @type='poster']/@url")[0];
echo $url;
Working live demo: http://codepad.viper-7.com/wdmEay
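If several thumb/poster images can come back, a small loop over the same XPath result prints them all (same attribute names assumed):

foreach ($xml->xpath("//image[@size='thumb' and @type='poster']/@url") as $url) {
    echo (string)$url . "<br/>";
}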

How can I login to YouTube and get the content of this page?

Please help me get the content of a page like:
http://www.youtube.com/my_videos_annotate?feature=vm&v=someVideoId
using PHP/cURL.
I think I need to log into YouTube's service first, but I don't know how to do this.
Use the YouTube API. Example:
$feedURL = 'http://gdata.youtube.com/feeds/api/users/**ACCOUNTNAME**/uploads?max-results=50';
$sxml = simplexml_load_file($feedURL);
$i = 0;
foreach ($sxml->entry as $entry) {
    $media = $entry->children('media', true);
    $watch = (string)$media->group->player->attributes()->url;
    $thumbnail = (string)$media->group->thumbnail[0]->attributes()->url;
    $input = array('http://www.youtube.com/watch?v=', '&feature=youtube_gdata_player');
    $change = array('', '');
    $link = @str_replace($input, $change, $watch);
    echo $link . '</br>';
    echo $thumbnail . '</br>';                 // thumbnail
    echo $media->group->title . '</br>';       // title
    echo $media->group->description . '</br>'; // description
}
Hope this will help.

Loading content from remote site doesn't work, but why?

I'm still working on this catalogue for a client, which loads images from a remote site via PHP and the Simple DOM Parser.
// Code excerpt from http://internetvolk.de/fileadmin/template/res/scrape.php, this is just one case of a select
$subcat = $_GET['subcat'];
$url = "http://pinesite.com/meubelen/index.php?".$subcat."&lang=de";
$html = file_get_html(html_entity_decode($url));
$iframe = $html->find('iframe', 0);
$url2 = $iframe->src;
$html->clear();
unset($html);

$fullurl = "http://pinesite.com/meubelen/".$url2;
$html2 = file_get_html(html_entity_decode($fullurl));
$pagecount = 1;
$titles = $html2->find('.tekst');
$images = $html2->find('.plaatje');
$output = '';
$i = 0;
foreach ($images as $image) {
    $item['title'] = $titles[$i]->find('p', 0)->plaintext;
    $imagePath = $image->find('img', 0)->src;
    $item['thumb'] = resize("http://pinesite.com".str_replace('thumb_', '', $imagePath), array("w" => 225, "h" => 162));
    $item['image'] = 'http://pinesite.com'.str_replace('thumb_', '', $imagePath);
    $fullurl2 = "http://pinesite.com/meubelen/prog/showpic.php?src=".str_replace('thumb_', '', $imagePath)."&taal=de";
    $html3 = file_get_html($fullurl2);
    $item['size'] = str_replace(' ', '', $html3->find('td', 1)->plaintext);
    unset($html3);
    $output[] = $item;
    $i++;
}

if (count($html2->find('center')) > 1) {
    // ok, multi-page here, let's find out how many there are
    $pagecount = count($html2->find('center', 0)->find('a')) - 1;
    for ($i = 1; $i < $pagecount; $i++) {
        $startID = $i * 20;
        $newurl = html_entity_decode($fullurl."&beginrec=".$startID);
        $html3 = file_get_html($newurl);
        $titles = $html3->find('.tekst');
        $images = $html3->find('.plaatje');
        $a = 0;
        foreach ($images as $image) {
            $item['title'] = $titles[$a]->find('p', 0)->plaintext;
            $item['image'] = 'http://pinesite.com'.str_replace('thumb_', '', $image->find('img', 0)->src);
            $item['thumb'] = resize($item['image'], array("w" => 225, "h" => 150));
            $output[] = $item;
            $a++;
        }
        $html3->clear();
        unset($html3);
    }
}
echo json_encode($output);
So what it should do (and does with some categories): output the images, the titles and the thumbnails from this page: http://pinesite.com
This works, for example, if you pass it a "?function=images&subcat=antiek", but not if you pass it a "?function=images&subcat=stoelen". I don't even think it's a problem with the remote page, so there has to be an error in my code.
Ehm... trying to state the obvious maybe, but 'stoele'?
As it turns out, my code was completely fine; it was a missing space in the HTML of the remote site that kept the Simple PHP DOM Parser from recognizing the iframe I was looking for. I fixed it on my end by running str_replace on the code first to replace the faulty markup.
I know it's a dirty solution, but it works :)
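Roughly what that workaround looks like (the search/replace strings below are placeholders, not the actual broken markup from the remote site): fetch the raw HTML, patch it, and hand the corrected string to str_get_html() instead of file_get_html().

$raw = file_get_contents(html_entity_decode($url));
// Placeholder fix: reinsert the missing space so the parser can see the iframe tag
$fixed = str_replace('<iframesrc=', '<iframe src=', $raw);
$html = str_get_html($fixed);
$iframe = $html->find('iframe', 0);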
