JavaScript setInterval function - PHP

I have a page that shows users' posts and refreshes automatically using JavaScript's setInterval function (called from a jQuery ready handler).
$(document).ready(function() {
    setInterval(function() {
        $('#content').load('test.php');
    }, 5000);
});
But the problem is that I will have to create a separate page called test.php containing the same content, which will be fetched every 5 seconds. I don't want people viewing the source and finding the page with all the data on it.
For example, this site has a recent forum topics page which updates every couple of seconds:
http://awesomescreenshot.com/0d4o0n2e0
I looked in the page source to find the link to the page, and this is what I found:
http://awesomescreenshot.com/0a2o0n691
I don't want people to be able to find that...
Is there a better way around this jQuery function? E.g. calling a PHP function that just runs the query that would otherwise live in the test.php file?

Thinking about security in terms of where the data is going isn't quite right. Instead, think about who has access to it. If you don't serve the data from PHP to someone who shouldn't see it in the first place, then it doesn't really matter how they try to view it.
So your test.php needs to have security around it that hooks into your authentication. A minimal sketch in PHP (assuming your login code sets a session flag like $_SESSION['is_authorized']):
session_start();
if (!empty($_SESSION['is_authorized'])) {
    echo render_posts(); // hypothetical helper: your existing query/output logic
} else {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
Security through obscurity will only hurt you in the long run. Even if you could obscure the location of that data, it leaves open the possibility that someone may find it eventually. So do the security on the backend, out of reach of hackers, instead.

Related

How to save a static HTML page using PHP on WordPress

I have a WordPress site. On my site I have one specific page that shows a lot of content, based on many custom post types. I have built the page by writing a specific PHP template for it called patio, i.e. page-patio.php.
The problem is that the logic is complex, so it takes the server about 30 seconds to respond. I have optimized images and everything else that loads at that time, but what takes too long is the server response itself.
I could try to optimize at the server level, but it makes no sense for all that complex logic and database reading to be done every time a user wants to display the page. The data changes once a day, maybe more often in the future.
I want to run a cron job that executes a PHP snippet. That snippet would prepare the page, i.e. write the HTML. So when a user opens the page I would just show that static HTML page and let them navigate the content with JavaScript.
Anyone found a solution for this?
Good morning.
Is it a server problem, or the number of CSS files, JS files and cookies? Is it a shared server?
Maybe changing your theme to one that uses plain JavaScript, or fewer JS and CSS libraries, would be better. Be clear about the problem first: whether it is the server, the number of libraries and cookies, or both. However, if you can't change anything, you can contract a better hosting service for your application.
I gave up on WordPress to prioritize page performance.
I was able to solve my issue very nicely, thanks to this 10-year-old post:
Link to Stackoverflow post on saving to html page
I created a cron event that periodically runs a PHP snippet; I use the Code Snippets plugin to create the event:
if ( ! wp_next_scheduled( 'iKid_cron_twicedaily' ) ) {
    wp_schedule_event( time(), 'twicedaily', 'iKid_cron_twicedaily' );
}
add_action( 'iKid_cron_twicedaily', 'iKid_Static' );
And then I used the same plugin to create the iKid_Static function.
This function uses ob_start() and file_put_contents('yourpage.html', ob_get_contents()) to write the HTML page on my server.
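A minimal sketch of what such a function might look like (render_patio_content() is a hypothetical stand-in for the post's actual page-building logic, and the output path is a placeholder):
function iKid_Static() {
    // Buffer everything the expensive logic echoes, then save it to disk.
    ob_start();
    render_patio_content(); // hypothetical: the heavy queries/markup from page-patio.php
    file_put_contents( 'yourpage.html', ob_get_contents() );
    ob_end_clean();
}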
And then on my actual page I code:
// $cacheFile holds the path to the HTML file written by iKid_Static()
$content = file_get_contents( $cacheFile );
echo $content;
This way my users now get all the information on that page in just 2 seconds, down from about 30 seconds.
I can surely improve this further when I move to a better server, but for now it is a great improvement for me.

Save URL of AJAX-loaded page, so it can be loaded after a refresh

We have an application written in PHP. Its main view is, for example, /pages/index.
When the user clicks on certain links, it pulls in other views via AJAX, i.e. a call may look like /pages/publish, and the PHP outputs the relevant HTML for the publish section back into the index view.
The problem is that we'd like to give the user the option of refreshing and seeing the same view as before. My initial thought is this: when we use .load() in jQuery, take the URL it is going to load and store it somewhere to be read by the PHP if the user refreshes. Is that the best way, or can someone think of a better way to do this whole thing?
Check out jQuery.address which should solve your problems! It allows AJAX loading of new pages, and will update the address bar accordingly. If a user saves this URL and reloads it, the script on the page can then load the correct page.
Alternatively, if you're HTML5-only, then you can try history.pushState() which will modify the URL without using the hash symbol, but support isn't 100% yet. (I don't think... it certainly behaves oddly on iPad from my experience.)
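A minimal sketch of the pushState approach (the #content container and the /pages/... URLs are assumptions carried over from the question):
// Load a view via AJAX and record its URL in the address bar.
function loadView(url) {
    $('#content').load(url);
    history.pushState({ view: url }, '', url);
}

// Handle back/forward so those buttons restore the right view.
window.addEventListener('popstate', function (e) {
    if (e.state && e.state.view) {
        $('#content').load(e.state.view);
    }
});
On a full refresh the browser requests the real URL (e.g. /pages/publish), so the PHP can render that view directly.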

AJAX authentication in a div

Hey all.. here's a question from GJ in Holland.
I am busy with my first AJAX web project and really like the idea that one PHP file (index) is loaded and from there XMLHttpRequests load and refresh the content of the divs without refreshing the page.
Things are running well so far, and about 4 div sections get different content depending on which menu buttons you press (all through getdata functions and XMLHttpRequests).
My last step is to integrate an extra authentication div. I am trying to implement a nice jQuery fade-in/fade-out system with a login.php with the input fields for username and password; a process_login.php which compares the data against MySQL and returns whether there's a match or not; and finally a secured page where the user can log out when successfully authorized.
These pages seem to work seamlessly when I load login.php directly in the browser.
When I use getdata and XMLHttpRequest to load login.php into a div section on index.html, nothing works anymore, because it seems it can no longer use the functions declared on the login.php page.
Reading AJAX for Dummies doesn't give me any answers, although I am sure there must be an easy-to-understand, logical explanation for this.
I can't get my head around it.. any info is welcome... greets
GJ
JavaScript loaded through AJAX does not become part of the window. You have to explicitly execute it (e.g. using eval). There's no direct solution to this problem, so you need to come up with a model for your application to know about the resources needed by anything it loads through AJAX.
The best way to do this is to create some application-wide convention - e.g. set up a cross reference of pages & script files, and use $.getScript to load them on demand. Ideally you would check to see if a resource is already loaded before trying to load it again.
Here's a simple idea you could use. In the output of your login.php add a tag at the top, e.g.
<span id="script" style="display:none">login,/scripts/login.js</span>
Then after an ajax call that loads a page, do something like this:
var data = $('#wrapper').find('#script').html().split(',');
if (!window[data[0]]) {
    $.getScript(data[1]);
}
So basically you're passing some info in the HTML that the loader uses to figure out what it needs. The first parameter is a namespace, so you can check if it's already loaded. The 2nd is the path to the script.
You could flesh this out to account for more than one script, use JSON for the data format, etc.. but this is a basic idea.
Yeah, you could always just include all your scripts up front, too :) However, loading on demand is a good idea for any nontrivial application, so you don't clutter things up with scripts you don't need. The login script is only going to be needed once per session, after all.
As to why..... I don't know why it behaves this way.
However, as to a fix/workaround: I'm in a similar situation currently, where I'm loading in pages (actually ASP/JScript rather than PHP). What I've discovered is that the scripts you write in the page being loaded in are no longer available when it is loaded through AJAX. I have experienced the same problem when the page being loaded contains an applet or another HTML object-type tag.
A solution is to move your scripts to an external file on the server; from there your page will be able to reach them regardless of whether it was loaded by AJAX as a panel or as a standalone page.
Example (this is obviously JScript rather than PHP, but the loading will be similar):
Page login.asp contains in its <head>:
<script type="text/javascript" src="scripts.js"></script>

Script to insert data on a different domain

I am thinking about writing a script that will perform a sort of checkout procedure automatically, similar to an eBay sniper.
I will know exactly what the page looks like. All I really want to do is load the page from a different domain than the one running my script into an iframe, have jQuery insert the data into the appropriate fields and then use JavaScript to click the submit button.
I have been reading about security issues with accessing information across different domains. On the domain I am trying to submit to, I would like to call a few jQuery functions such as .find() to get the ids of the submit buttons so I can programmatically click on them.
This might sound malicious, which it's not: there is something going on sale that will sell out quickly, and I will not be around to click refresh a hundred times to try to buy it. I figured it would be a cool project to make a script that buys it for me.
Anyway, my first question is: is this possible? Secondly, what would be the best way to solve this problem? I was going to use PHP/JavaScript/jQuery. Will this even work / be allowed? Also, if anyone has any other information that might help me out, that would be great. Thanks.
Not going to happen... The only way I think you can accomplish this, for yourself, is by inserting your code with Firebug (or the like) on a per-use basis, or perhaps in a Greasemonkey configuration... but it's not something you could publish so that others would get the same functionality just by going to your page.
In Firefox, you can create a bookmark that runs JavaScript (instead of navigating to another page). So now you can inject any JavaScript into your own page.
With this info, you can load jQuery from another domain along with any other scripts and automate whatever you like.
This only works for you, the person with the special bookmark, but you can hand the bookmark to others for their use.
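A sketch of such a bookmarklet (spread over several lines for readability; a bookmark needs it on a single line, and the field id and button selector are hypothetical placeholders for whatever the target page uses):
javascript:(function () {
    // Fill in the form and click submit on the loaded page.
    document.getElementById('quantity').value = '1';     // hypothetical field
    document.querySelector('#checkout-submit').click();  // hypothetical button
})();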

What can cause a double page request?

I am currently investigating a double-request problem on my site. Not all the time, but sometimes, a requested page will in fact load twice... which is not really a problem until it happens on a page whose PHP inserts rows into my DB on request (my tracking script).
I have read that an empty src in an image tag, or an empty url() in a CSS background, could potentially cause the page to be requested twice.
However, I can't find any problems with those.
Is there anything else that could be causing something like this?
ANSWER FOR MY SITUATION
After some extensive research, it turns out that in my case specifically, the second request was coming from the user agent "Mediapartners-Google". I began to notice that on pages that serve an AdSense ad, I could expect a secondary visit from this crawler within seconds of visiting the page myself.
This doesn't seem to happen on pages without AdSense ads.
I am going to mark an answer below, because it seems like for most situations, those are the correct things to check.
I have sat beside people who I would swear knew better than this, and watched aghast as they double-clicked on every hyperlink in our app.
Didn't take long to figure out why they were experiencing double the page load time of everyone else...
Things like this certainly tend to give one pause when implementing pages that change the backend state. A lot of people put sequence numbers in hidden form elements so the backend can detect a double-submit.
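A minimal sketch of that idea in modern PHP (the token name is illustrative):
// When rendering the form, store a one-time token and emit it
// as <input type="hidden" name="token" value="...">:
session_start();
$_SESSION['form_token'] = bin2hex( random_bytes(16) );

// When handling the POST, accept the token only once:
if ( !isset($_SESSION['form_token']) || $_POST['token'] !== $_SESSION['form_token'] ) {
    exit('Duplicate or invalid submission');
}
unset($_SESSION['form_token']); // a second submit of the same form fails the check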
The causes I've seen before:
Missing stylesheet or image
Web developer addon for Chrome/Firefox sometimes requests things twice if you're validating HTML etc.
Browser inconsistency
Sometimes it's just too difficult to track down the root cause of a double request.
Either way, you should NOT be changing database state (or session state) through a GET request. The only SQL query you should be running without POST data is SELECT. All updates and inserts should be done using forms, even if the form consists only of a submit button.
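For example, a minimal guard in PHP (the $pdo connection and the hits table are made-up placeholders):
// Refuse to mutate state unless the request came from a form post.
if ( $_SERVER['REQUEST_METHOD'] !== 'POST' ) {
    header('HTTP/1.1 405 Method Not Allowed');
    exit;
}
$stmt = $pdo->prepare('INSERT INTO hits (page) VALUES (?)'); // hypothetical tracking insert
$stmt->execute([ $_POST['page'] ]);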
src="" in certain elements on certain browsers (such as <img src="" />) can request the current page again.
404s are a prime source of a page seemingly being requested twice. Check that your CSS, JS and image sources are all correct.
We had a very strange behaviour in our CMS where an iframe in a jQuery dialog lightbox caused a doubled database insert.
After hours of debugging and loud WTFs, we nailed it down: the dialog's close method was setting focus to the dialog's iframe before destroying it, which caused a reload of the iframe URL!
I have seen this countless times. The internet is full of strange people who keep double-clicking on everything they come across.
You can stop this on your web site by attaching a global double-click event listener to every anchor tag (<a> tags).
For example, if you have jQuery installed, you can do the following:
jQuery('a').on('dblclick', function(e) { e.preventDefault(); });
This is just an example, of course. You can achieve the same result using vanilla JavaScript.
That should silently ignore the double click action.
In case they are clicking twice in quick succession rather than double-clicking, you can throttle the click handler on all the links on the page to ensure that they cannot be clicked more than once within, say... 3 seconds.
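A rough sketch of that throttle, reusing jQuery as in the snippet above (the 3-second window is arbitrary):
var lastClick = 0;
jQuery('a').on('click', function (e) {
    var now = Date.now();
    // Swallow any click that follows another within 3 seconds.
    if (now - lastClick < 3000) {
        e.preventDefault();
        return;
    }
    lastClick = now;
});
This uses one shared timestamp across all links; storing the timestamp per element (e.g. with $.data) would throttle each link independently.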
