I am building a "Reddit"-like site. Users can post a URL, and I want to fetch the correct image from that page with PHP. What I would need is a script like the ones sites such as Facebook or Tumblr use to fetch images. I have already seen scripts which get the HTML content and search for "img" tags. Are there any better methods/scripts available? Maybe even scripts which order the images by size: the bigger the image, the more important it is.
Thanks for any answers.
You may want to check out phpQuery; it will let you easily iterate through all the images on a given page. You can then work out the area of each image and sort them accordingly.
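As a minimal sketch of that idea (it assumes phpQuery.php is available locally, the target URL is hypothetical, and getimagesize() needs allow_url_fopen enabled to read remote images):

<?php
// Sketch: list a page's images, biggest first, using phpQuery.
require_once 'phpQuery.php';

$url = 'http://example.com/some-post'; // hypothetical target page
$doc = phpQuery::newDocumentFile($url);

$images = array();
foreach (pq('img') as $img) {
    $src = pq($img)->attr('src');
    if (!$src) {
        continue;
    }
    // Note: relative src values would need resolving against $url first.
    $info = @getimagesize($src); // reads the image to get width/height
    if ($info !== false) {
        $images[$src] = $info[0] * $info[1]; // area in pixels
    }
}

// The bigger the area, the more "important" we assume the image is.
arsort($images);
print_r(array_keys($images));
?>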
It depends a bit on what you're looking for and what image the user would like to have with their post. To give you an example: I once wrote a method that searches for a company's logo on the company's website. To do so, I searched for, indeed, the img tags using simple_html_dom, and filtered those tags on the presence of "logo" in the alt attribute. The results are displayed to the user to select the right image; it could be that you find multiple images fitting your purpose.
I would indeed, as you proposed, have a look at the size and skip small images (e.g. smaller than, let's say, 50 px).
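A rough sketch of that approach, using simple_html_dom (the URL is hypothetical, and the 50 px cut-off is just the example threshold mentioned above):

<?php
require_once 'simple_html_dom.php';

$html = file_get_html('http://example.com/company-site');

$candidates = array();
foreach ($html->find('img') as $img) {
    // Keep only images whose alt text mentions "logo"...
    if ($img->alt && stripos($img->alt, 'logo') !== false) {
        $size = @getimagesize($img->src);
        // ...and skip small images, e.g. anything under 50 px on a side.
        if ($size !== false && $size[0] >= 50 && $size[1] >= 50) {
            $candidates[] = $img->src;
        }
    }
}

// Show $candidates to the user so they can pick the right one.
print_r($candidates);
?>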
I wonder if anyone can point me in the right direction.
I have a rather large spreadsheet of product info that needs plugging into a shop. The tricky bit is that the spreadsheet has a link pointing to the relevant page on another site which has the product's details. What I need to do is grab the relevant image and save it locally so I can use it later. The reason I'm thinking down this line is that there are 7500 products...
My friend suggested I could maybe use PHP and fopen.
The image does have an outer tag ID which I can refer to.
I was thinking of iterating through the spreadsheet. This is the type of link I have to work with:
http://www.apc.com/resource/include/techspec_index.cfm?base_sku=APCRBC105
The images themselves are called something random, but I figured I could rename them to the more relevant SKU number as I grab them. So:
- iterate through the spreadsheet by SKU number
- identify the image by the relevant ID on the page (I'm assuming it's in the same place on every page)
- save the image while renaming it to the correct SKU number
Any ideas on how I could go about this? The thought of visiting each page manually and saving the image 7500 times doesn't seem the best way forward!
Thanks for looking
Rip the base_sku from your links:
APCRBC105
Then use curl to fetch the image page:
http://www.apc.com/products/moreimages.cfm?partnum=APCRBC105
Rip the image link with a regular expression matching:
<div align="center">
<img align="center" src="http://www.apcmedia.com/resource/images/500/Front_Left/35531838-5056-9170-D33F24AE47742E6C_pr.jpg" />
</div>
Then use curl again to rip the actual image and save it.
That should work.
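Something like this rough sketch, for example (the URLs follow the pattern above, but the page structure may have changed, so treat the regex as a starting point):

<?php
// Fetch a URL body with curl.
function fetch($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $body = curl_exec($ch);
    curl_close($ch);
    return $body;
}

$sku  = 'APCRBC105'; // ripped from the link's base_sku parameter
$html = fetch('http://www.apc.com/products/moreimages.cfm?partnum=' . $sku);

// Rip the image link out of the centered <img> tag shown above.
if (preg_match('/<img align="center" src="([^"]+)"/', $html, $m)) {
    // Save the image locally, renamed to the SKU from the spreadsheet.
    file_put_contents($sku . '.jpg', fetch($m[1]));
}
?>

Loop that over the SKU column of the spreadsheet and all 7500 images come down without a single manual visit.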
If there aren't any issues regarding copyrighted material, take a look at Google Refine.
You can grab content from websites based on your cell values and use it afterwards to build more complex scenarios.
See the screencasts for more info (screencast 3 talks about fetching values via URLs).
Once you have the image URLs in your spreadsheet, it should be fairly easy to fetch them via curl or similar.
Frankly, I am not sure where this would go, but I assume the way to create what I'm about to describe is PHP... so I'm sorry if it's in the wrong section.
Basically I have a website, using WordPress, where users can review websites and post a 5-star rating of the site. Each reviewed website has its own dedicated page.
So what I want to do is create an image from the rating statistics that the website owners can place on their own pages. The image will need to, when clicked, link to that site's review page.
The image would be about 150x150, with the overall 5-star rating, the name of the reviewed site and the name of my website. It would also be cool if there were some automatically generated HTML embed code so the site owners can simply copy and paste it.
I hope I explained myself OK. I've tried searching Google, but I'm not entirely sure what to search for and therefore have found nothing useful.
Thanks.
EDIT
I can create the embed code manually. I just really need to know how to make the picture update automatically.
Hey, what you are asking for is quite complex, and it would be too easy to just give you code. So here is some reading for you. Your problem is quite meaty, and no doubt you'll learn loads sorting it out...
http://www.phptutorial.info/learn/create_images/
http://www.qualitycodes.com/tutorial.php?articleid=20&title=How-to-create-bar-graph-in-PHP-with-dynamic-scaling
I just need to really know about how to make the picture update automatically.
Just point the image URL at a PHP script which will generate the image.
You will need the imagettftext function to write text over the image, and imagecopy to draw the stars (for the votes). You will also need an image of a star and a background.
So, the approximate algorithm will be:
1. Open the background image with imagecreatefrompng - it becomes our generated image
2. Open the star image
3. Write the title of the reviewed site with imagettftext
4. Draw the stars by copying the existing star image onto our background image (with imagecopy)
5. Output the generated image with imagepng
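Put together, a rough sketch of that algorithm might look like the following badge.php. All file names, the font path, sizes and coordinates are assumptions to adjust to your own artwork:

<?php
// badge.php - generates the 150x150 review badge on every request.
$siteName = isset($_GET['site']) ? $_GET['site'] : 'Example Site'; // hypothetical parameters
$rating   = isset($_GET['stars']) ? (int) $_GET['stars'] : 0;
$rating   = max(0, min(5, $rating)); // clamp to 0-5 stars

// 1. Open the 150x150 background - this becomes the generated image.
$badge = imagecreatefrompng('badge-background.png');

// 2. Open the star image (assumed to be about 20x20 px).
$star = imagecreatefrompng('star.png');

// 3. Write the title of the reviewed site.
$black = imagecolorallocate($badge, 0, 0, 0);
imagettftext($badge, 10, 0, 10, 20, $black, 'arial.ttf', $siteName);

// 4. Draw one star per rating point by copying the star onto the badge.
for ($i = 0; $i < $rating; $i++) {
    imagecopy($badge, $star, 10 + $i * 25, 40, 0, 0, 20, 20);
}

// 5. Send the generated image to the browser.
header('Content-Type: image/png');
imagepng($badge);
imagedestroy($badge);
imagedestroy($star);
?>

The embed code the site owners copy would then just be something like <a href="http://yoursite.com/review/123"><img src="http://yoursite.com/badge.php?site=Example&stars=4" /></a>, and the picture updates automatically because the script regenerates it on every request.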
Hey guys, I'm building an application, and one of the features it will have is the ability to show photo links from Twitter inline, sort of like what TweetDeck has done in their Chrome browser version and sites like Crowdreel have been able to do. I spent some time researching how to grab image tags from URLs and found this fantastic script: http://www.bitrepository.com/extract-images-from-an-url.html
The script is great and does exactly what I need. However, my challenge now is that the array it returns includes every image on the page, including thumbnails, ads, etc. So a link to a page like this: http://lockerz.com/s/69901787
will return an array with quite a few image links to sort through.
What I need is a link to the main image so that I can display it inline with tweets. My idea is to run some sort of code to figure out which of the images on the page is the largest. What are your thoughts on this? Is this the right method, or is there something easier that's built into PHP perhaps? Thanks for all your help, guys!
Once you have the links to the images you can pass them to the getimagesize() function in PHP.
Look at this example: http://php.net/manual/en/function.getimagesize.php#example-2267
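For instance, a minimal sketch that picks the largest image from the array the extraction script returns ($links is assumed to come from that script; note that getimagesize() downloads each image, so this is slow for long lists):

<?php
function largest_image(array $links) {
    $best = null;
    $bestArea = 0;
    foreach ($links as $url) {
        $info = @getimagesize($url); // fetches the image to read its width/height
        if ($info === false) {
            continue;
        }
        $area = $info[0] * $info[1];
        if ($area > $bestArea) {
            $bestArea = $area;
            $best = $url; // current biggest image wins
        }
    }
    return $best;
}

echo largest_image($links);
?>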
This question is a bit open at the moment as I'm not sure the idea is even possible.
So far I've loaded an image from a URL and then used the jQuery UI draggable feature to allow the user to drag HTML text (which has been replaced using Cufón font replacement) over the top of the image.
The major step (which is what my question relates to) is being able to take the image and the text layered over it and save the result, either to the server or potentially offering the user the option to save the altered image to their HD. What would also be useful is uploading it to Facebook using the Facebook API, but I know this part is possible.
It all hangs on whether the first step is even achievable: saving the image and layered text as a combined image.
I wonder if there is a PHP/jQuery solution that would allow me to do this?
My suggestion would be to have an internal URL that outputs the final image using jQuery and PHP, then take a screenshot of that page using webkit2png. You should know the dimensions etc., so you'll be able to crop the resulting screenshot down to just the region you're looking for.
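The cropping step could then be done in PHP with GD, for example (a sketch assuming webkit2png has already saved the full-page screenshot as screenshot.png, and that the region coordinates are known):

<?php
$x = 100; $y = 50;  // top-left corner of the composed region on the page (assumed)
$w = 600; $h = 400; // dimensions of the region (assumed)

$full = imagecreatefrompng('screenshot.png');
$crop = imagecreatetruecolor($w, $h);

// Copy just the image-plus-text region out of the full screenshot.
imagecopy($crop, $full, 0, 0, $x, $y, $w, $h);

imagepng($crop, 'combined.png'); // the final merged image, ready to serve or upload
imagedestroy($full);
imagedestroy($crop);
?>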
Possible Duplicate:
How does Facebook Sharer select Images?
When I attach a link on Facebook, I get the title, URL, description and images from that page. But Facebook filters the images and shows just the important ones, and this is what I don't understand.
For example, attach this link:
http://tonlinegames.com/
This is a gaming site with photos of the games. When you attach it, Facebook will give you only the images of the games as results, even though there are lots of other images on the page, like buttons and so on.
Facebook makes use of Open Graph meta tags when available:
http://developers.facebook.com/docs/opengraph
Otherwise, it probably just uses common-sense heuristics (title, h1, p tags, large images, etc.).
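Checking for an og:image tag before falling back to heuristics is easy to do yourself. A hedged sketch with DOMDocument, using the question's URL (any page without the tag just falls through to the heuristics):

<?php
$html = file_get_contents('http://tonlinegames.com/');

$doc = new DOMDocument();
@$doc->loadHTML($html); // suppress warnings from real-world markup

$xpath = new DOMXPath($doc);
$nodes = $xpath->query('//meta[@property="og:image"]/@content');

if ($nodes->length > 0) {
    echo $nodes->item(0)->nodeValue; // the image the page itself nominates
} else {
    // No og:image - fall back to scanning <img> tags and size heuristics.
    echo 'no Open Graph image found';
}
?>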
There is an official tool called URL Linter that displays which info Facebook takes from a page. It doesn't explain the exact rules it uses, but there is some useful info in the "debug" section.
There are some patterns that Facebook could be trying, like the first image that is bigger than 50x50 (usually the first big image is the right one). Or maybe it looks for an image that links to the site itself, since many sites have one like this. Probably Facebook combines more than one of these kinds of patterns to be more accurate.
It probably just looks at the largest images within the body of the website. I'm sure they also have some constraints on what the image size can be; too small wouldn't work when coming up in somebody's feed.