I have a URL: url="http://some url";
Is it possible to create an image of the URL using PHP?
I tried using imagecreatefromjpeg, but it only accepts an image file as input, not a URL like "http://".
I'm not sure what you mean - do you mean create an image of the page itself? Then yes, it's possible. All you need to do is parse the HTML, fetch any CSS and parse it, add in the images, and process any JavaScript in the page.
Of course, it might take you a few years to build such an application, but it can be done. And at the end you'll have a browser written in PHP, which will be quite slow.
If you need to take a snapshot of a web URL using just PHP, you need an external tool like CutyCapt.
It is quite invasive for a server (you need an X environment), but it is the easiest solution at present.
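A minimal sketch of calling CutyCapt from PHP via exec(), assuming cutycapt and xvfb-run are installed on the server; the paths, flags, and URL here are placeholders that may differ on your system:

<?php
// Hedged sketch: shell out to CutyCapt under a virtual X server.
$url = 'http://www.example.com/';  // placeholder
$out = '/tmp/snapshot.png';
$cmd = sprintf(
    'xvfb-run --server-args="-screen 0 1024x768x24" cutycapt --url=%s --out=%s',
    escapeshellarg($url),
    escapeshellarg($out)
);
exec($cmd, $output, $status);

if ($status === 0) {
    header('Content-Type: image/png');
    readfile($out);
}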
If you want to create an image with your URL text in it, then use something like imagefttext (for more, visit php.net/imagefttext). In case you want to take a screenshot of the webpage at a given URL, go to this SO link: Command line program to create website screenshots (on Linux)
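A minimal GD sketch of the first option, drawing the URL text onto a blank image; the font path is an assumption and must point to a real TTF file on your system:

<?php
// Hedged sketch: render the URL string onto an image with GD.
$url  = 'http://www.example.com/';  // placeholder
$font = '/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf';  // assumed font path

$im = imagecreatetruecolor(400, 50);
$bg = imagecolorallocate($im, 255, 255, 255);
$fg = imagecolorallocate($im, 0, 0, 0);
imagefilledrectangle($im, 0, 0, 399, 49, $bg);

// imagefttext(image, size, angle, x, y, color, fontfile, text)
imagefttext($im, 12, 0, 10, 32, $fg, $font, $url);

header('Content-Type: image/png');
imagepng($im);
imagedestroy($im);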
I was trying to figure out a way to have jQuery (if possible) display the latest image from a directory, regardless of its file name. Basically, I just want a simple <img> tag to be updated to show the image. I realize this might not be possible with jQuery, so my fallback would be PHP.
I've been trying to research how to do this, but honestly do not know where to start. Any tips to get going would be greatly appreciated.
If you want to do it with jQuery, you can! You only need to add a few lines of logic to your app.
For example, when you add images to that folder, also write a text file with the name of the image just added. This text file will then always contain the filename of the last image uploaded. Finally, using JavaScript (with jQuery or anything else), you make one request for the text file and then a second request to obtain the image with that filename (sketched below).
That is just a very basic example to share the idea; depending on your particular case, you can add or change a lot of details to make this work for you.
BTW, to do it with PHP, one way is to execute a system command such as 'ls -ltrh' and parse the output to obtain the last filename.
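A minimal sketch of the marker-file idea; the form field name "image" and the images/ directory are assumptions:

<?php
// Hedged sketch: on upload, save the file and record its name in
// latest.txt so a later request can discover the newest image.
$uploadDir = __DIR__ . '/images/';
$filename  = basename($_FILES['image']['name']);

if (move_uploaded_file($_FILES['image']['tmp_name'], $uploadDir . $filename)) {
    // Overwrite the marker so it always names the last upload.
    file_put_contents($uploadDir . 'latest.txt', $filename);
}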
You can use a PHP script to get the latest modified file in a specific directory.
Code can be found here: Get last modified file in a directory
You can then format that into JSON, which can be fetched through a jQuery AJAX call: http://api.jquery.com/jQuery.ajax/
With those two techniques, you can manipulate the data to display the image.
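For example, a hedged sketch of the PHP side, returning the newest image's filename as JSON; the directory and extensions are assumptions, and a jQuery $.getJSON() call could consume the response:

<?php
// Hedged sketch: report the most recently modified image as JSON.
$dir = __DIR__ . '/images';  // assumed image directory
$latest = null;
$latestTime = 0;

// GLOB_BRACE is not available on every platform; adjust if needed.
foreach (glob($dir . '/*.{jpg,jpeg,png,gif}', GLOB_BRACE) as $file) {
    $mtime = filemtime($file);
    if ($mtime > $latestTime) {
        $latestTime = $mtime;
        $latest = basename($file);
    }
}

header('Content-Type: application/json');
echo json_encode(['latest' => $latest]);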
I am planning to use Colorbox, the jQuery lightbox plugin. I am just wondering what the best method would be to implement an 'add/upload image' option, so it would be easy for a user to upload new images without having to go into the HTML markup and add an individual <img> line of code for each new image.
I am currently only knowledgeable in HTML and CSS. So if no one already has a snippet they could provide me with, which language would be 'simpler' to learn for just this specific task? Would it be PHP or JS or something different?
OR am I going about this completely wrong?
Of course, the images have to be uploaded to a server and tracked in a database. You need a suitable back-end design, which means you need PHP (or Python) to interact with the database. If you only need dummy upload behavior, you can easily do it with jQuery's appendTo() method.
As you are using a plugin, you will have a method to add an image to the webpage. It will be something like $("user-uploaded-image").plugin-name();
Where do you want to upload the image?
Uploading an image means transferring the image to another location (a server). For that you need a server-side script that:
1) Receives the image.
2) Saves the image to the server's hard disk.
PHP is a server-side scripting language. JS is a client-side language, used here for sending requests (in this case, the image) to the server.
Just Google for sample image upload scripts, or see the sketch below.
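A minimal sketch of such a script; the form field name "image" and the uploads/ directory are assumptions:

<?php
// upload.php - hedged sketch of a server-side upload handler.
if ($_SERVER['REQUEST_METHOD'] === 'POST' && isset($_FILES['image'])) {
    $target = __DIR__ . '/uploads/' . basename($_FILES['image']['name']);
    if (move_uploaded_file($_FILES['image']['tmp_name'], $target)) {
        echo 'Upload saved.';
    } else {
        echo 'Upload failed.';
    }
}
?>
<!-- The matching client-side form; JS could submit this via AJAX instead. -->
<form method="post" enctype="multipart/form-data">
    <input type="file" name="image">
    <input type="submit" value="Upload">
</form>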
From taking a look around Google for the upload scripts suggested by @kiren Siva, I came across quite a few CMS-type plugins that give me the ability to do exactly what I need.
Some examples can be found here.
Thank you all for your help and suggestions.
I'm trying to capture some images from an old database.
When writing scrapers, I use Ruby (but am comfortable with PHP as well) to directly open() a website and read its contents. I sometimes also use the script to call the appropriate curl ... command.
However, the database I'm scraping some pieces out of returns a page and then embeds the target image, using an image name made of a series of random numbers generated, I assume, by the server-side script. For example:
<img ... show_image.jsp?343523.jpg
However, I cannot call this show_image script directly (access is denied); it only works when embedded in the website as a whole.
Can I use curl, or Ruby or PHP, to download the entire page, for example 1929.2.14.aspx, in such a way that it includes the embedded image generated by show_image.jsp?343523.jpg?
If I simply curl the aspx file directly, I naturally just get the HTML. How might one save both the HTML and the embedded image via scripting, the way a browser's "web archive" feature does manually?
Any tips, links to tutorials, etc. appreciated...
You should probably be using Mechanize to scrape websites in Ruby. When you do, it will set cookies and the referer for you, so getting the image will be as easy as:
agent.get(image_url).save_as 'local_filename.jpg'
If the script (show_image.jsp, for example) is doing a simple referrer check, you may be able to work around it by writing your PHP (or Ruby) scraper in such a way as to set the referrer before making the GET:
curl --referer http://www.example.com http://www.example.com/show_image.jsp?bar.jpg
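The same idea in PHP with the cURL extension; the URLs are placeholders mirroring the command above:

<?php
// Hedged sketch: fetch the image while spoofing the Referer header.
$ch = curl_init('http://www.example.com/show_image.jsp?343523.jpg');
curl_setopt($ch, CURLOPT_REFERER, 'http://www.example.com');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$data = curl_exec($ch);
curl_close($ch);

file_put_contents('local_filename.jpg', $data);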
I've got an idea for a site that would generate PNG or JPEG screenshots of webpages on the fly. The end user would never see the pages; the HTML would be turned into a screenshot instead, and the end user would see that screenshot.
How can I get started on this? I guess what I'm looking for is some kind of PHP function that takes the HTML as an argument and then produces an image file in a specified location.
As far as I know, PHP does not do this.
You can, however, find a solution using external tools.
Here is how I would do it:
1) Generate the HTML.
2) Pass this HTML to an external tool using exec(). There is, for instance, this one.
3) Display the generated picture.
http://www.zubrag.com/articles/create-website-snapshot-thumbnail.php
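A hedged sketch of those three steps, using wkhtmltoimage as the external renderer (an assumption; CutyCapt or the tool linked above would work similarly):

<?php
// Hedged sketch: write the HTML to disk, render it with an external
// tool, then serve the resulting image.
$html = '<html><body><h1>Hello</h1></body></html>';  // step 1
file_put_contents('/tmp/page.html', $html);

exec('wkhtmltoimage /tmp/page.html /tmp/page.png', $out, $status);  // step 2

if ($status === 0) {  // step 3
    header('Content-Type: image/png');
    readfile('/tmp/page.png');
}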
You have a forum (vBulletin) that has a bunch of images - how easy would it be to have a page that visits a thread, steps through each page, and forwards the images to the user (via AJAX or whatever)? I'm not asking about filtering (that's easy, of course).
Doable in a day? :)
I have a site that uses CodeIgniter as well - would it be even simpler using it?
Assuming this is to be carried out on the server, curl + regexp are your friends... and yes, doable in a day.
There are also some open-source HTML parsers that might make this cleaner.
It depends on where your scraping script runs.
If it runs on the same server as the forum software, you might want to access the database directly and check for image links there. I'm not familiar with vBulletin, but it probably offers a plugin API that allows for high-level database access. That would simplify querying all posts in a thread.
If, however, your script runs on a different machine (or, in other words, is unrelated to the forum software), it would have to act as an HTTP client. It could fetch all pages of a thread (either automatically, by searching for a NEXT link in each page, or manually, by having all pages specified as parameters) and search the HTML source code for image tags (<img .../>).
Then a regular expression could be used to extract the image URLs. Finally, the script could use these image URLs to construct another page displaying all the images, or it could download them and create a package.
In the second case the script actually acts as a "spider", so it should respect things like robots.txt and meta tags.
When doing this, make sure to rate-limit your fetching. You don't want to overload the forum server by requesting many pages per second. The simplest way to do this is probably just to sleep for X seconds between each fetch.
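A hedged sketch of the HTTP-client case; the thread URL and filenames are placeholders, and relative image URLs would still need resolving against the page URL:

<?php
// Hedged sketch: fetch one thread page, extract image URLs with a
// regex, and download them with a pause between fetches.
$html = file_get_contents('http://forum.example.com/showthread.php?t=123');

preg_match_all('/<img[^>]+src=["\']([^"\']+)["\']/i', $html, $matches);

foreach (array_unique($matches[1]) as $i => $imageUrl) {
    file_put_contents("image_$i.jpg", file_get_contents($imageUrl));  // naive naming
    sleep(2);  // rate-limit, as advised above
}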
Yes, doable in a day.
Since you already have a working CI setup, I would use it.
I would use the following approach:
1) Make a model in CI capable of:
- logging in to vBulletin (images are often added as attachments, and you need to be logged in before you can download them). Use something like Snoopy.
- collecting the URL of the "last page" button using preg_match(), parsing the URL with parse_url() and parse_str(), and generating links from page 1 to the last page
- collecting the HTML from all generated links, still using Snoopy
- finding all images in the HTML using preg_match_all()
- downloading all images, still using Snoopy
- moving each downloaded image from a tmp directory into another directory, renaming it imagename_01, imagename_02, etc. if the same imagename already exists
- saving the image name and precise byte size in a db table, so you can avoid downloading the same image more than once
2) Make a method in a controller that collects all images (a sketch follows below).
3) Set up a cronjob that collects images at regular intervals. wget -O /tmp/useless.html http://localhost/imageminer/collect should do nicely.
4) Write the code that outputs pretty HTML for the end user, using the db table to get the images.
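A hedged sketch of step 2, the controller method the cronjob hits; the class, model, and method names are all assumptions, not real CodeIgniter or vBulletin APIs:

<?php
// Hedged sketch of an imageminer controller for CodeIgniter.
class Imageminer extends CI_Controller
{
    public function collect()
    {
        $this->load->model('image_model');  // assumed model name

        // Steps from the list above: log in, walk the pages, grab images.
        $this->image_model->login();
        foreach ($this->image_model->thread_pages() as $url) {
            $html = $this->image_model->fetch($url);
            $this->image_model->download_images($html);
        }
    }
}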