PHP: Is it possible to save a website, simulating a manual save?

I understand how to use fopen, file_get_contents, etc. - but I'm curious to know if there is a way to trigger a "manual save" of a webpage. As if I were to physically right-click > "Save As" > to Desktop?
I'm asking this because I'm trying to learn more about securing data. I didn't think it was possible to hide source code, but I recently found out that it can be done by generating the markup with JavaScript. So the "hidden code" won't appear in View Source, but it will still appear in Firefox Web Developer Tools and such.
So let's say I'm using JavaScript to secure my source code (as best as can be, that is). If someone were to Ctrl+U (View Source) my webpage (or fopen/file_get_contents), they would see:
<div class="StartofSecretSourceCode"></div>
Instead of:
<div class="StartofSecretSourceCode">
<div class="Something">Some stuff and things</div>
</div>
In order for the user to see the full source code in this section, they would have to inspect the code in Firefox/Firebug/etc or manually save the webpage.
So I'm wondering if it's possible for someone to automate "manually saving" the webpage using PHP (or something else). And if so, I'm curious to know how you would do that in PHP. I hope that my question makes sense >.<

You can use the Guzzle library or cURL to send a request and save the response (the same way a browser does - you can literally simulate a browser). So you can send a GET request, download the page (and its assets, like JS files), and save it to disk from the PHP side - no problem at all. Note, though, that plain cURL does not execute JavaScript, so it sees the same markup as View Source, not the DOM that scripts build afterwards.
However, if you want to protect your code, don't send it to the user: EXECUTE it on the server side and return only the RESULTS (as an HTML file) to the user. This is the standard and proper way of protecting code.
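For example, a minimal cURL sketch (the URL and output filename here are placeholders) that fetches a page and writes it to disk:
<?php
// Fetch a page over HTTP and save it to disk - this mimics "Save As"
// for the raw markup, but does NOT execute any JavaScript on the page.
$url = 'https://example.com/';                       // placeholder URL
$ch  = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);      // return body as a string
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);      // follow redirects
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0');  // look like a browser
$html = curl_exec($ch);
if ($html === false) {
    die('Download failed: ' . curl_error($ch));
}
curl_close($ch);
file_put_contents('saved_page.html', $html);         // write markup to disk
Each asset the page references (images, CSS, JS) would need its own request; on the command line, wget --page-requisites automates exactly that.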


PHP - Don't allow downloading of source files

I see that on some websites users can download the source files: the HTML, CSS, JS files, and so on. I don't want this to be possible on my website. What should I do? Thanks for watching!
P/S: If you can, please show me this approach with Zend. I'm using Zend 1.9.6.
You cannot restrict the download of resources. The browser needs to download them in order to process them; if you restricted them from being downloaded, the browser wouldn't be able to access them either.
That is impossible. The browser needs the source code to display your site, in the same way that you can't prevent the user from downloading an image you show to them. The best you can achieve is to obfuscate your CSS and JavaScript into hard-to-read scrambled code, using YUI Compressor, for example. But someone determined will always be able to decipher your code logic...
As said, it's impossible to hide JS or CSS files, but what you can do is minify (compress) them, which makes them harder for a user to interpret while making your site load faster at the same time.
Check this implementation of the Minify library with ZF; it provides CSS/JS view helpers to automate the compression:
http://hobodave.com/2010/01/17/bundle-phu-compress-your-js-css-in-zend-framework/
If their browser can't read your HTML, how can it display your page?
They won't be able to read your PHP (assuming that your server is set up to parse PHP correctly), but they will always be able to read the HTML output.
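To illustrate that boundary, here is a trivial sketch (the "secret" value is made up): the PHP below runs entirely on the server, and the client only ever receives the echoed HTML.
<?php
// secret.php - this logic executes on the server and is never sent out
$secretMultiplier = 7;                    // hypothetical hidden business logic
$result = 6 * $secretMultiplier;
echo '<p>The answer is ' . $result . '</p>';
View Source in the browser shows only <p>The answer is 42</p>; the PHP itself never leaves the server.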

Open an editor locally using the web browser and retrieve the data on the server using php?

I have been struggling for days to solve this problem and I can't seem to find any good guidance. The problem: I want to implement a feature in my web application that gives users the option of editing a text in any editor they have locally and then saving the file (the saved file will be online).
Suggested solution: well, my idea is:
1) to create a folder for the user locally, in the browser's file location.
2) to open the application using exec() (before doing so, checking which operating system the user has and creating the appropriate error handling).
3) the saved file should then end up in the folder created in point 1.
4) to retrieve the data from that folder.
Please advise me if you have a better idea.
What you're trying to do is impossible. PHP is server-side and it has no control over the client, it can only send it a sequence of characters to render (the page that gets displayed).
There are JavaScript-based rich editors such as CKEditor and TinyMCE which you can provide for the client to use, but that's about as far as you can go. Additionally, as every web browser is a little different from the others and has its own quirks and bugs when it comes to running client-side JavaScript/DOM operations, you can expect a lot of weird little issues that happen in one particular version of one particular browser but not in others. And if the client has JavaScript turned off, they won't see any editor at all.
In short, you can't do all that. You can do some of that. Remember, with HTTP the user is in complete control, and you can do nothing "on the user's machine". If you could, that would be called "a dangerous security exploit" and would stop working as soon as the various browsers' coders got to it.
You can send the output to the user with an appropriate MIME type, that will open the editor of your choice. You can even invent your own MIME type to do that (the user must install the editor by himself).
Then the user will save the text on his machine. You can't save to a remote machine from most editors, since what you want is not a "save" but an "upload".
Finally the user can recover the file he or she just saved, and submit it to you via a POST form, for example.
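A rough sketch of that round trip, assuming hypothetical file names and a hypothetical form field name:
<?php
// download.php - serve the current text so it opens in the user's editor
header('Content-Type: text/plain');
header('Content-Disposition: attachment; filename="document.txt"');
readfile('storage/document.txt');          // placeholder server-side path

<?php
// upload.php - accept the edited file back via a POST form, e.g.
//   <form method="post" enctype="multipart/form-data">
//     <input type="file" name="edited"> <button>Upload</button>
//   </form>
if (isset($_FILES['edited']) && $_FILES['edited']['error'] === UPLOAD_ERR_OK) {
    move_uploaded_file($_FILES['edited']['tmp_name'], 'storage/document.txt');
    echo 'Saved.';
}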
Frankly, where I live we call this "how to put one's ass before other people's kicks". Just think of all the possible editors, each maybe with its own format: if the user (un)knowingly chooses something weird such as "Save in Word 2015 Extra Format (Compressed)", and uploads the file to your server -- are you prepared to understand the file format and do something meaningful with it?
A very common alternative is to implement one of several rich text editors in HTML - there's CKEditor, for example, or TinyMCE, and so on. They will let the user produce clean HTML and upload it to your server automatically.

Get a screenshot of a page [duplicate]

Possible Duplicate:
Website screenshots using PHP
I have an application where people watch a stream of content (a video stream) and need to click a button each time something happens (suppose they see a red light).
I want to somehow take a screenshot of the stream at the moment the user clicks the button.
The problem is that the stream is not mine and I am using an IFrame to another page with the stream. The stream is a flash object.
I need to screenshot the page at the moment of click with the flash content using PHP/Javascript and save it on the server.
I saw something that seems similar to what I need but the solution is using C# and .NET.
Programmatically get a screenshot of a page
EDIT:
An idea, if anyone can explain it here: how could I do that using a plugin/Java applet or something the user might install when entering the site? (The easier, the better.)
You can't get a screenshot client-side, and certainly not with PHP. PHP runs on your server, not on the client.
The only way to do this would be to write a browser plugin of some sort, or utilize Java.
See this post: Take Screenshot of Browser via JavaScript (or something else)
After a request to view your website has been made by a browser, your PHP code is run and generates the content of the page. By the time your user sees the page, the PHP has finished running, and the content of the page is static as far as that is concerned (although of course can be changed by other means, like ajax or javascript).
That's why this isn't possible at all with PHP, and it's not possible by another means for security reasons (for example, it could reveal any client-side scripts the user is running, etc.).
The only way a screenshot could be taken would be to render the page yourself. The one approach I can think of offhand would be to record the time the page loaded and the time the button was pressed, and then render the page for that duration, but this is by no means foolproof and not really a valid suggestion.
I'm afraid that you'll probably have to redesign this portion of your site.
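If rendering it yourself server-side is acceptable despite those caveats, one option is to shell out to a headless renderer such as wkhtmltoimage (assuming it is installed on the server; note it re-renders the page fresh, so it cannot capture the user's live Flash state):
<?php
// Render a URL to a PNG on the server with the wkhtmltoimage tool.
// This does NOT capture what the user currently sees in their browser.
$url = escapeshellarg('http://example.com/stream-page');    // placeholder URL
$out = escapeshellarg('/tmp/screenshot_' . time() . '.png');
exec("wkhtmltoimage $url $out", $output, $status);
if ($status !== 0) {
    echo 'Screenshot failed.';
}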
Since you say that the stream is a Flash object, you could simply load it into a Flash environment. Flash would allow you to do what you want - saving screenshots, or, better said, images generated by that Flash, i.e. still frames. However, I am not sure whether Flash would allow you to take a screenshot of the display including the OS environment.

Class or function to automatically generate PDF and Print

Does anyone know of a JavaScript or PHP function to generate a PDF and print it (to a printer) automatically?
Excuse my ignorance; I searched Google about this and cannot find sufficient documentation.
Many, many thanks, guys
What are you generating your PDF from? I presume that what you want to do is generate a PDF from e.g. a form submission, then print it on the user's computer?
You cannot print from PHP (well there are horrible ways of doing it, but don't) but I doubt it would help you even if you could - it would be printed on the server side, and I imagine you would be wanting to print on the client (i.e. browser) computer.
You can generate PDFs in PHP (have a look at FPDF) and send them to the browser, and you can print a web page in JavaScript, but to combine the two would be tricky, if it is possible at all. You certainly can't do it without prompting the user.
If you were to generate the PDF, then open it in an iframe, you could maybe call something through javascript to prompt the user with the standard printing options dialog, but that would be as far as it goes. It wouldn't work everywhere, if it worked anywhere, which I somehow doubt.
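For the generation half, the canonical FPDF hello-world looks like this (streaming the PDF inline to the browser; printing would still need the user's own dialog):
<?php
require 'fpdf.php';                  // FPDF library, assumed to be available

// Build a one-page PDF and send it to the browser.
$pdf = new FPDF();
$pdf->AddPage();
$pdf->SetFont('Arial', 'B', 16);
$pdf->Cell(40, 10, 'Hello World!');
$pdf->Output();                      // default: send inline to the browser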
To create PDFs, there's the great FPDF library.
Printing automatically fortunately isn't possible - just imagine if it were: every f***g website could (in addition to the annoying popups and stuff) print something out on your printer (advertisements, most of the time).
EDIT :
If you have control over the clients, you could write a little batch script like this (not tested):
AcroRd32.exe /t %1 printername
and then set PDF files in your browser to open automatically with this "program", which should then print the file without a print dialogue.
Note that you need access to the clients for this, and it isn't tested. In theory it works: I did something very similar once to print labels directly from the browser, but that was a few years ago on WinXP; I don't know whether it still works on Win7 (or whatever you're using).

Curl preg_match

Our computers download images when we open new webpages. For example: if a webpage has an image (image.jpg), our computer downloads it while we are browsing that page.
Some webpages use AJAX methods. For example: you don't see an image in the page's source code, yet your computer downloads one, because if you click a link on that page, AJAX will show that image...
Let me show an example:
<div id="ajax_will_load_image_here"></div>
Okay, how can PHP cURL see (or download) that image? cURL can't see that image when I try to use the preg_match function. Actually, there is an image, and I want to download it using PHP cURL. Any advice?
If I understand the question correctly, there is no convenient way of doing that.
Your crawler/spider would have to parse the website and evaluate the JavaScript.
There are libraries for that, but support is very limited.
There are, however, methods where an actual browser is used to evaluate the page (without displaying it, but with proper environment variables set, like the resolution etc.).
The generated source, including JavaScript DOM modifications, is then available.
This is, for example, how the Google search previews are generated.
But if you require user interaction, it gets pretty specific and complicated.
I am sorry to disappoint you, but using curl and preg_match the old-school way, like we did when JavaScript was not yet so common, won't work.
However, for most legitimate use cases this is more than sufficient, and websites today are more and more designed to work without JavaScript - especially the content, for crawling purposes. That is a must in search engine optimization, and which website doesn't want that?
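For static markup, the old-school curl + preg_match approach the answer refers to looks roughly like this (placeholder URL; the pattern only finds images present in the raw HTML, never ones injected later by AJAX):
<?php
// Fetch the raw HTML exactly as served, before any JavaScript runs.
$ch = curl_init('http://example.com/page');     // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$html = curl_exec($ch);
curl_close($ch);

// Extract the src attribute of every <img> tag in the static markup.
preg_match_all('/<img[^>]+src=["\']([^"\']+)["\']/i', $html, $matches);
foreach ($matches[1] as $src) {
    echo $src, "\n";   // AJAX-injected images will never show up here
}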
