I have read a few similar questions and answers, but none completely address my issue.
Here is My Scenario:
I have something similar to a TinyMCE-style editor (a home-brewed version, though). It lets users enter some text, an image or two, etc. I have code that takes the items in there and renders them into a smaller div (essentially a thumbnail) in real time.
Here is What I Want to Do
Ultimately, the user may want to use their 'page' somewhere else, so I would like to let them go to a screen, view thumbnails of each page, and pick one.
Here is the Problem
Obviously, I could just use the same thumbnail code to render each page thumbnail. However, it can be bandwidth-intensive (each page could have several images, not to mention that the calculation would have to be performed many times - we are talking perhaps 40 to 50 thumbnails on a preview page).
So I wanted to take the thumbnail div and somehow create a PNG or JPG when they save the page in the editor (so I would store both the page's markup and a thumbnail image), and push it up to my PHP script to save the image to the server.
My first thought was that maybe canvas could do that, but there is the issue of translating the text and images onto the canvas first, which may or may not be possible.
So there it is. I am interested in any and all options, including commercial libraries if available that will do this -- the only thing is, I would like it to be in JavaScript.
You may want to look at:
http://html2canvas.hertzen.com/
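If you go the html2canvas route, the rendering happens client-side and you can POST the result of canvas.toDataURL() to your PHP script. A minimal sketch of the PHP side, assuming the JavaScript posts the data URL in a form field called thumbnail (the field name and save path are placeholders, not anything prescribed by html2canvas):

<?php
// save_thumbnail.php - minimal sketch of a receiver for a canvas data URL.
// Assumes the client POSTs a field named "thumbnail" containing something
// like "data:image/png;base64,iVBORw0...". Field name and paths are examples.

if (!isset($_POST['thumbnail'])) {
    http_response_code(400);
    exit('No thumbnail data received');
}

$dataUrl = $_POST['thumbnail'];

// Strip the "data:image/png;base64," prefix and decode the rest.
if (!preg_match('#^data:image/(png|jpeg);base64,#', $dataUrl, $m)) {
    http_response_code(400);
    exit('Unexpected image format');
}
$binary = base64_decode(substr($dataUrl, strpos($dataUrl, ',') + 1));

// Save under a server-generated name so users cannot overwrite each other's files.
$ext  = ($m[1] === 'jpeg') ? 'jpg' : 'png';
$file = __DIR__ . '/thumbnails/' . uniqid('thumb_', true) . '.' . $ext;
file_put_contents($file, $binary);

echo basename($file); // return the stored filename to the editor

On the client, html2canvas hands you a canvas you can serialize with toDataURL() and send via AJAX when the user saves; the exact call signature depends on the html2canvas version you use.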
A similar question was already asked:
Screen Grab with PHP and/or Javascript?
Related
I am building an avatar-generator for a PHP/MySQL site I am working on. It uses CSS to layer multiple .png files to create the background, body, facial expressions, etc. for a user's avatar. This I have covered.
I want to add a feature to my site that will allow the user to download their layered avatar "image" as one .jpg file. Is this even possible? I think I have seen this functionality before but can't recall the site where I saw this now.
Of course, I could come up with a series of pre-generated files that would cover all of the computations possible with my images, but with somewhere around 200 objects to choose from and a maximum of 10 layers of choices, the number of permutations possible is somewhere around 8.14702044e+22! Obviously, this is possible for me to do but I would be old and gray before completing the task!
Poking around the Internet has led me to believe there might be some way to "screen cap" - with what software, and whether it can capture a small section of the screen, I don't know. Besides, would this bog down my site (which is currently running at top speed)?
I've searched through Stack Overflow for similar questions but didn't find anything that addresses my problem specifically. That said, I am not certain what to even search for (the precise terminology) as this concept of layering and saving as one image is foreign to me.
I found a solution to my own problem. It's not quite what I had in mind when planning this part of the project, but the end result will be close enough (with some other modifications).
I will implement html2canvas (http://html2canvas.hertzen.com) to take a screenshot of the user's avatar when they have pressed "Save". I will store the resultant image to my server and this will be their avatar. Their selected variable data will be stored in a database so that they can load up their avatar at a later date and make changes to it.
I am developing a CakePHP application. In it, I want to show a PowerPoint slideshow to the end user, but the condition is that the user should only be able to view the show, not download it.
Can anyone please suggest the best way to do this?
If the slideshow is based on images, you can split each image into 9, 16, or more squares and display the tiled image. That way, if the user decides to "Save as" an image, they will get only 1/9th or 1/16th of the real image. If the slideshow is quite big, it will be a pain to put all the pieces together, which will discourage users from trying to save the slides.
You can see such implementation here - http://whatismycar.com/info/16540/ - the 4 images below the header are in fancybox and if you try to 'Save-as' one of them you will save only a small tile of the original image.
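If you generate the tiles server-side, GD can do the splitting. A rough sketch that cuts one slide image into a 4x4 grid (file names, grid size, and JPEG quality are just placeholder choices):

<?php
// tile_slide.php - rough sketch: split one slide image into a 4x4 grid of tiles.
// Input/output paths and the grid size are examples, not fixed requirements.

$src    = imagecreatefromjpeg('slides/slide01.jpg'); // hypothetical source file
$cols   = 4;
$rows   = 4;
$width  = imagesx($src);
$height = imagesy($src);
$tileW  = (int) floor($width / $cols);
$tileH  = (int) floor($height / $rows);

for ($row = 0; $row < $rows; $row++) {
    for ($col = 0; $col < $cols; $col++) {
        $tile = imagecreatetruecolor($tileW, $tileH);
        // Copy one rectangle of the source image into the tile.
        imagecopy($tile, $src, 0, 0, $col * $tileW, $row * $tileH, $tileW, $tileH);
        imagejpeg($tile, sprintf('slides/tiles/slide01_%d_%d.jpg', $row, $col), 85);
        imagedestroy($tile);
    }
}
imagedestroy($src);

On the page you would then lay the tiles out edge to edge (for example in a fixed-width container with no gaps between them) so they appear as one image.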
Hope this helps.
It is impossible to prevent images from being downloaded from the internet, but you can make it hard for users with this approach. You can also hide the source HTML image path with PHP; check it here.
While I am no expert on this subject, something worth noting is what Youtube seems to be doing.
Ever notice how the whole video never loads if you pause it?
Upon monitoring the network tab during a video you will see that they are actually making hundreds or even thousands of requests for video segments from their server and most likely using JS to clear the cache of parts you've watched.
This is why going back to an earlier point in the video causes it to stall for a bit while it re-downloads the segment you want to see.
At the end of the day, PrtScn will trump all of your efforts, because the web browser does not have the privilege to control the keyboard outside of its own environment.
I am looking for a way to create functionality similar to what happens when you post a link to an existing website on Facebook. If this statement is rather ambiguous, I will try to elaborate.
When you paste your link and submit your post, Facebook, together with your link, shows a small preview of the page you are posting (text and maybe a small image).
What are the ways to achieve this?
I read a similar post, but the thing is that I do not need an image so much; text will be sufficient.
I am working in PHP, but the language is not important, because I am looking for a high-level idea.
Previously I was thinking about parsing the content of the link with cURL, but the thing is that in a lot of situations the text returned by Facebook is not available on the page.
Are there other ways?
From what I can tell, Facebook pulls from the meta name="description" tag's content attribute on the linked page.
If no meta description tag is available, it seems to pull from the beginning of the first paragraph <p> tag it can find on the page.
Images are pulled from available <img> tags on the page, with a carousel selection available to pick from when posting.
Finally, the link subtext is also user-editable (start a status update, include a link, and then click in the link subtext area that appears).
Personally, I would go with this route: cURL the page, parse it for a meta description tag, and if one is not present grab some likely data using a basic algorithm or just the first paragraph tag; then allow user editing of whatever is presented (it's friendlier to the user and also sidesteps issues with pages returning different content depending on the user agent). Do the user-facing control as AJAX so that you don't run into issues with however long it takes your server to fetch the link you want to preview.
I'd recommend using a DOM library (you could even use DOMDocument if you're comfortable with it and know how to handle possibly malformed html pages) instead of regex to parse the page for the <meta>, <p>, and potentially also <img> tags. Building a regex which will properly handle all of the myriad potential different cases you will encounter "in the wild" versus from a known set of sites can get very rough. QueryPath usually comes recommended, and there are stackoverflow threads covering many of the available options.
Most modern sites, especially larger ones, are good about populating the meta description tag, especially for dynamically generated pages.
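As a rough sketch of the cURL + DOMDocument approach (not a hardened implementation; error handling, character-encoding detection, and og: tag support are simplified or left out):

<?php
// preview.php - sketch: fetch a URL and pull a title/description for a link preview.

function fetch_preview($url)
{
    $ch = curl_init($url);
    curl_setopt_array($ch, array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_TIMEOUT        => 10,
        CURLOPT_USERAGENT      => 'Mozilla/5.0 (compatible; LinkPreviewBot/1.0)',
    ));
    $html = curl_exec($ch);
    curl_close($ch);
    if ($html === false || $html === '') {
        return null;
    }

    $doc = new DOMDocument();
    libxml_use_internal_errors(true);   // tolerate malformed markup
    $doc->loadHTML($html);
    libxml_clear_errors();

    $title = '';
    $nodes = $doc->getElementsByTagName('title');
    if ($nodes->length) {
        $title = trim($nodes->item(0)->textContent);
    }

    // Prefer <meta name="description">, fall back to the first <p>.
    $description = '';
    foreach ($doc->getElementsByTagName('meta') as $meta) {
        if (strtolower($meta->getAttribute('name')) === 'description') {
            $description = trim($meta->getAttribute('content'));
            break;
        }
    }
    if ($description === '') {
        $paras = $doc->getElementsByTagName('p');
        if ($paras->length) {
            $description = trim($paras->item(0)->textContent);
        }
    }

    return array('title' => $title, 'description' => $description);
}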
You can scrape the page for <img> tags as well, but you'll want to host the images locally: you can either host all of the images and then delete all except the one chosen, or you can host thumbnails (assuming you have an image processing library installed and enabled). Which you choose depends on whether bandwidth and storage are more important, or the one-time processing cost of running imagecopyresampled, imagecopyresized, Gmagick::thumbnailimage, etc. (pick whatever you have at hand/your favorite). You don't want to hotlink to the images on the page, both because of the bandwidth you would be imposing on the other site and especially because of the likelihood of ending up with broken images when linking any site with hotlink prevention (referrer checks and the like), or when the images expire. Personally I would probably go for storing thumbnails.
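For the thumbnail side, a minimal GD sketch using imagecopyresampled (JPEG-only and without any cropping logic, just to show the shape of the call; paths and sizes are placeholders):

<?php
// make_thumb.php - sketch: downscale a fetched image to a bounded thumbnail.

function make_thumbnail($srcPath, $dstPath, $maxW = 150, $maxH = 150)
{
    list($w, $h) = getimagesize($srcPath);
    $scale = min($maxW / $w, $maxH / $h, 1);   // never upscale
    $newW  = (int) round($w * $scale);
    $newH  = (int) round($h * $scale);

    $src = imagecreatefromjpeg($srcPath);      // assumes a JPEG source
    $dst = imagecreatetruecolor($newW, $newH);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $newW, $newH, $w, $h);
    imagejpeg($dst, $dstPath, 85);

    imagedestroy($src);
    imagedestroy($dst);
}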
You can wrap the entire link entity up as an object for handling expiration/etc if you want to eventually delete the image/thumbnail files on your own server. I'll leave particular implementation up to you since you asked for a high level idea.
but the thing is that in a lot of situations the text returned by Facebook is not available on the page.
Have you looked at the page's meta tags? I've tested with a few pages so far and this is generally where content not otherwise visible on the rendered linked pages is coming from, and seems to be the first choice for Facebook's algorithm.
Full disclosure upfront, I'm a developer at ThumbnailApp.com.
It's a JSON API service with an optional JavaScript SDK which I think does exactly what you're after: it will parse a string to detect any URLs and return the title, description, and thumbnail of the asset. If the page has OpenGraph tags, it will use those for the image thumbnail. It's currently in private beta, but we're adding more accounts each week.
If you feel that you really need a do-it-yourself solution:
Check out the Python-based webkit2png and the headless browser PhantomJS. They can render web pages to an image (the default size is 800x600); then you'll have to write some code to resize and crop the image, as taswyn mentioned. Ideally you would then upload the resized image to Amazon S3 and have it served by a CDN such as CloudFront.
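If you go the PhantomJS route from PHP, the simplest integration is to shell out to the phantomjs binary with a small capture script. The script name below (rasterize.js, adapted from the examples that ship with PhantomJS), its argument order, and the paths are assumptions for the sketch:

<?php
// capture.php - sketch: call PhantomJS from PHP to render a page to a PNG.
// Assumes phantomjs is on the PATH and that rasterize.js takes <url> <output file>.

function capture_page($url, $outFile)
{
    $cmd = sprintf(
        'phantomjs rasterize.js %s %s 2>&1',
        escapeshellarg($url),
        escapeshellarg($outFile)
    );
    exec($cmd, $output, $status);
    return $status === 0 && file_exists($outFile);
}

// Example usage (hypothetical URL and path):
// capture_page('http://example.com/', '/tmp/preview.png');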
To get the title and description, first fetch the URL content (with cURL or whatever) and check the Content-Type header to make sure it's a web page. If it is, you can then use an HTML parser such as the SimpleHTMLDOM PHP library to grab the title and description metadata. If you want it exactly like Facebook, you will also need to check for any OpenGraph tags, specifically the og:image tag.
Also don't forget about caching. The first render and description parsing can take a long time. Even if your site is fast, the webpage you're rendering could be slow and the best approach is to render / parse it once, then just save and return the resized image and meta data for subsequent requests. Depending on what your requirements are you may need to refresh the cached data every hour or you could get away with refreshing it once a day.
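A very simple file-based cache is usually enough to start with. A sketch (the cache directory, TTL, and helper names are arbitrary choices for illustration):

<?php
// cache.php - sketch: cache the parsed preview data per URL for one day.

function cached_preview($url, $ttl = 86400)
{
    $cacheFile = sys_get_temp_dir() . '/preview_' . md5($url) . '.json';

    if (file_exists($cacheFile) && (time() - filemtime($cacheFile)) < $ttl) {
        return json_decode(file_get_contents($cacheFile), true);
    }

    $data = fetch_preview($url);   // e.g. the fetch/parse helper sketched earlier, or your own
    if ($data !== null) {
        file_put_contents($cacheFile, json_encode($data));
    }
    return $data;
}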
To do it yourself takes quite a bit of work and lots of server configuration. I feel using a 3rd party service is a better way to go, but obviously I have a biased opinion :)
Okay, I know questions like this exist in multiple forms across StackOverflow and other places on the web, but none of them is pointing me to what I actually need (maybe I missed a question that was more catered to my problem).
I need a Facebook-style image upload mechanism, using Codeigniter and javascript/jquery. Here's what it's supposed to do -
Using a single file upload control (or for that matter any clickable control), open up the "Choose Files" dialog window, and allow the user to select multiple images from it. (I know this cannot yet be done in IE, and I do not really care about the multiple file selection not working in IE).
Once the user has selected the files, the page should display a series of progress bars (like Facebook does). As each image gets uploaded, the corresponding progress bar reaches 100% (if it's simpler to implement, I am willing to forego the graphical progress bar for a text that displays the progress percentage), and the thumbnail of the image is displayed next to the completed progress bar (or text). At this point, the user should have the option to delete the uploaded image by clicking a cancel button (I think I can get this part working on my own).
The upload can be sequential (like Facebook does), or asynchronous (some upload libraries I found work this way).
What's most important (and the part that is stumping me) is the thumbnail generation. I know that there's some HTML5/CSS3 technique that allows you to display the thumbnail before the files have actually been uploaded, pulling them directly from the user's hard drive. But that won't work in IE8, and while I am not concerned about the multiple image selection not working in IE8, I need for the thumbnail generation to work cross-browser, and that includes IE8 (deciding on the browser compatibility is not something I can command, so please don't come up with a "screw IE!" solution).
I have tried using Uploadify (I have no constraints against using Flash), but cannot seem to be able to customize it to my needs. While Uploadify does indeed display progress bars, I was unable to find a way to generate (and display) thumbnails on the fly, in accordance with the behavior I described above. I know how thumbnail generation works in PHP; I just cannot figure out how to implement it together with the progress indicators. Am I looking for a suitable jQuery/AJAX call?
Any help and/or pointers would be appreciated. I admit that I might have missed a StackOverflow question that would solve my issue, so please direct me to that page, or to any other page you believe will help me. Please feel free to suggest upload libraries other than uploadify, which you believe I might find useful.
Thanks in advance. And thanks for reading through all this - I tried my best to make the question properly descriptive!
I have used jQuery File Upload with good results. It does need IE to be in compatibility mode, but it worked well for Chrome/Firefox.
UPDATE. It now claims to support IE 6.0+.
I'll focus on point 4 (the thumbnail generation) here. It can be done, but you'll end up using iframes (yeah, I know).
They can be as small as 1px, but you'll need them if you want to create thumbnail previews.
A good starting point would be here: http://www.zurb.com/playground/ajax_upload
As for creating the actual preview images (smaller versions) you can use CI's image library.
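As a rough sketch of what that CI image library call might look like (the source path and thumbnail dimensions are placeholders):

<?php
// Inside a CodeIgniter controller method, after the file has been uploaded.
// Source path and thumbnail size are placeholders for your own values.

$config['image_library']  = 'gd2';
$config['source_image']   = './uploads/photo.jpg';
$config['create_thumb']   = TRUE;   // writes photo_thumb.jpg next to the source
$config['maintain_ratio'] = TRUE;
$config['width']          = 120;
$config['height']         = 120;

$this->load->library('image_lib', $config);

if (!$this->image_lib->resize()) {
    log_message('error', $this->image_lib->display_errors());
}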
Let me know how it works out.
So I just put up a website for my high school and realized a lot of images are stretched, especially on this page:
www.eriesd.org/central/central2/staff.php
What would be the best way to make the images not so stretched?
I was thinking of adding a div and setting a background image with center center or 50% 50%. Also, on the Career and Tech pages I noticed the info page doesn't load in IE, but the other pages load fine. Has anyone else ever had this problem?
I'm basically getting the location and option from the menu and making an AJAX request, which loads one of three layouts that connect to my database and pull information depending on the option and location.
Assuming that the pictures are uploaded using some kind of php CMS, the first thing I would do, is process the images correctly at the moment they are uploaded: Apart from the bigger image, you would need to generate a thumbnail that fits the size you need for that page.
I would also recommend adding a notice to people who are uploading a picture, that this specific picture needs to have a landscape format as that is what you are using on the page.
CSS solutions would be my last resort to iron out small issues.
Edit: Apart from that I would seriously reconsider publishing all e-mail addresses like that and add some pagination as the page now takes a long time to load (especially with all the images being a lot bigger than you need them to be...).
They are stretched because you specified both width and height attributes on the <img> tag. If the actual image has different dimensions, one can see how the browser has no option but to distort the image to make it fit the specified height and width.
Just omit the height, the width, or both, and the images will be fine.
You should set the height only on the img and add the width:200px;text-align:center CSS to the anchor if you want the white area on either side (if the .image class doesn't already render the anchor as a block, add display:inline-block so the width applies). Omitting the width will shrink the whitespace around the image.
<a class="image" style="width:200px;text-align:center">
<img src="http://www.eriesd.org/central/central2/images/staff/kranking.jpg" alt="missing photo" height="112">
</a>
I'll answer your first question concerning images. The real problem is that your images are not sized to fit the space you want them to fill. One of them that I inspected was a 6MP (2848x2144) image weighing in at 1.5MB. There were many more of this approximate size and dimension. Any one of those images is larger than the entire page should be, by quite a lot. The first step is getting images to the size you need them to be. Your page is nearly 19MB. Not only do most browsers do a lousy job of scaling images, you're also sending a ton of extra data and making the page load very slowly for users without very fast connections. Imagine a user with a mobile browser waiting on this and chewing through their data plan! A user with DSL might need several minutes; dial-up could require hours.
If you're uploading them manually, resize them first. Figure out a size constraint and resize and crop first. If you're using a CMS, find settings, plugins, or customize it to make a smaller thumbnail version and use it.
To keep the layout looking nice and equal, the only thing you can do is either stretch them as they are now or, even better, crop the images a bit and resize them. You can probably do it programmatically for most of the pictures; just assume that the top center is where the person's head will be. You have stretched-picture issues all over the site, though.
As for the Career & Tech pages, they're still actually being loaded (at least in the latest IE) if you look at the source, but they're not being shown for some reason, so you probably have some CSS or JavaScript issue with .post or .content. It even pops up for a second sometimes and then disappears.
If you specify only a width, the height will be set proportionally and thus prevent stretching of your images.