Greetings everybody,
My problem is this: I have some custom-made statistics on my website where I log information about what users do (Google Analytics-like). Obviously I aggregate information over a couple of months, but the tables I store the information in have grown too large and have a negative impact on page loading. The flow is like this (in index, so it affects all pages):
1. Get the included files
2. Execute part of statistics queries
3. Effective page code
4. Execute the last part of statistics queries
To get rid of this problem I want to run those queries from <body onload="execQueries();"> or on document ready with JavaScript/AJAX.
How can I safely and securely run those queries using AJAX, so that they cannot be abused by a client with good knowledge of JavaScript/AJAX? Because if I simply expose that JS function, it can be called at any time by a user with Firebug.
The solution I'm thinking of involves $_SESSION: at the top of my index.php I store information about those queries (id, info), and in the script called by AJAX I check whether $_SESSION['query_info'] is set, execute the queries using the info stored there, and then call unset($_SESSION['query_info']);. So if the AJAX endpoint is called again, because that specific $_SESSION['query_info'] no longer exists, I don't touch the DB at all.
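The one-shot flow described here can be sketched as plain JavaScript so the logic is easy to follow and test; in reality the check would live in PHP's $_SESSION, and the function and field names below are hypothetical:

```javascript
// Sketch of the one-shot token idea, assuming a session-like store.
const session = {};

// index.php side: record which queries the AJAX endpoint may run.
function armStatsToken(queryInfo) {
  session.query_info = queryInfo;
}

// AJAX endpoint side: run only while the token is armed, then unset it.
function handleStatsRequest() {
  if (!session.query_info) {
    return null; // replayed call: token already consumed, touch nothing
  }
  const info = session.query_info;
  delete session.query_info; // the unset($_SESSION['query_info']) step
  return info; // here the real endpoint would execute the logged queries
}
```

Because the token is deleted before anything else happens, a second call from Firebug finds nothing to execute, which is exactly the property the question asks for.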
Do you think this is a secure solution, or do you have other ideas? Anything viable is welcome.
Thank you
Try putting your related JavaScript code into closures.
Related
I want to ask whether there is a way to track a database table using PHP and MySQL.
I want to do something like this: I have a table called post. When a user posts some data, other users need to view this data, with the latest post shown at the top. We can do this by refreshing a div every few seconds or by using AJAX. But can we use a trigger? As we know, it fires automatically when something is executed. So I want to know whether we can use a trigger in PHP code to automatically detect changes in the table, and when a new post is available, return the data from the database. Please give me a brief description of this. Thank you in advance.
The trigger is executed on the MySQL server, not on the PHP one (even if both are on the same machine).
So, I would say this is not quite possible -- at least not simply.
Still, consider this entry from the MySQL FAQ on Triggers:
23.5.11: Can triggers call an external application through a UDF?
Yes. For example, a trigger could invoke the sys_exec() UDF available at MySQL Forge here: http://forge.mysql.com/projects/project.php?id=211
So, there might be a way, actually, via a UDF that would launch the PHP executable/script; not that easy, but it seems possible ;-)
Read more about it in:
https://stackoverflow.com/a/1467387/3653989
An SQL trigger is a database object executed server-side.
What you want is a front-end technique to refresh your data without reloading the whole page.
You can refresh your page using:
<meta http-equiv="refresh" content="5">
With PHP, you can refresh the page using:
header("refresh: 3;");
but no one would suggest such a method, because you need to refresh the page only after a change in your database, not continuously.
So, if you already use PHP, you need JavaScript push technology:
Push, or server push, describes a style of Internet-based communication where the request for a given transaction is initiated by the publisher or central server. (Wikipedia)
JavaScript polling, long-polling, real-time techniques, and tools such as jQuery, Node.js, and Socket.IO include a lot of practices that give you this possibility.
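The simplest of those practices is plain polling. A minimal sketch, assuming the server exposes an endpoint returning posts newer than a given id (the endpoint name and JSON shape are assumptions; the transport is injected so it could be fetch, XHR, or jQuery.ajax):

```javascript
// Build a poll "tick" that asks the server for posts newer than the last
// one we saw; fetchLatest is any function returning a promise of rows.
function makePollTick(fetchLatest, onNewPosts) {
  let lastSeenId = 0;
  return async function tick() {
    const posts = await fetchLatest(lastSeenId);
    if (posts.length > 0) {
      lastSeenId = posts[posts.length - 1].id;
      onNewPosts(posts); // e.g. prepend the new rows to the post div
    }
  };
}

// In the page you would wire it up roughly like:
// setInterval(makePollTick(
//   id => fetch('latest.php?after=' + id).then(r => r.json()),
//   renderPosts), 5000);
```

Tracking the last-seen id keeps each poll cheap: the server only has to answer `WHERE id > ?`, which is far lighter than re-sending the whole table.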
Good day all,
Basically what I've got is a PHP-based site tied to a MySQL database, all on a local web server (nothing is accessed from outside of the company). The index page displays an image resembling a bar chart. Employees of the company will enter data periodically, which will update the image that appears on the index page. The index page will be displayed on a couple of different screens throughout the company, and I need that index page to refresh after someone alters the data in the database.
I've been messing around with various AJAX solutions, but as I don't know much about AJAX, I'm having trouble adapting something to work the way I need. Here's the way I was thinking about it:
-- on the index.php run a JavaScript function every minute or so that gets a response from dataChanged.php
-- dataChanged.php will query the database and get a timestamp from one of the tables.
-- the script on the index.php will then compare the timestamp to the last time the page was refreshed (or some variable that stores such information) and refresh if the data is new.
I'm somewhat proficient in PHP, but am very limited with JavaScript (and thus AJAX).
Can someone get me pointed in the right direction?
Thanks!
What you want is the standard JavaScript function setInterval.
Have it execute an AJAX call every now and then to get new data. Try to get a standard library that knows this stuff, probably jQuery.
Depending on whether the chart generation is time-consuming, I would go with different strategies. The preferred way would be to just generate the graph on each call, but if that is very time-consuming, I think your two-step solution works great.
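The compare-and-refresh step from the question's plan only needs a tiny piece of client-side state. A sketch, assuming dataChanged.php returns its timestamp as JSON (the `{ ts: ... }` shape is an assumption):

```javascript
let lastTs = null;

// Decide whether the page should reload, given the timestamp the server
// just reported. The very first poll only records the baseline.
function shouldReload(currentTs) {
  if (lastTs === null) {
    lastTs = currentTs;
    return false;
  }
  if (currentTs !== lastTs) {
    lastTs = currentTs;
    return true;
  }
  return false;
}

// Wiring it up, polling roughly once a minute:
// setInterval(async () => {
//   const { ts } = await (await fetch('dataChanged.php')).json();
//   if (shouldReload(ts)) location.reload();
// }, 60000);
```

Comparing against the server's own timestamp (rather than the client's clock) avoids any skew between the display machines and the database.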
Check out the link http://blog.codebusters.pl/en/auto-refresh-content-after-changes-in-database-ajax.
There you can see how to refresh your page after a change in the db.
OK, what I am trying to make is a system that supports tickets. Tickets have all kinds of info on a specific job. How can I make a ticket menu with links to tickets which contain all the info I need? For instance, I click on ticket number 777, so I have ?id=777 in the URL.
I need this page to constantly look for new tickets.
What you need to do is add the correct ticket number to an anchor:
<a href="page.php?id=777">Ticket 777</a>
<a href="page.php?id=778">Ticket 778</a>
on the page.php you can access the ticket number using the $_GET variable
<?php
$ticket = (int) $_GET['id']; // cast to int so the id is safe to use in a query
// now you have the ticket number in the variable
?>
I need this page to constantly look for new tickets.
I'm not 100% sure what you're exactly looking for here, but there are two basic solutions here:
AJAX to update the content without reloading the page
Reload the page every few minutes and have PHP handle it
The first is better for things that are constantly updated, like Twitter posts, news, stock tickers, etc., but it's a little overkill for something that updates relatively rarely (less often than every six minutes or so).
The tickets should be stored in some kind of database and then read out and looped over to create the table. This would probably make more sense in a list format (stacked divs) instead of a table, but then again, I don't know the specifics.
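The read-out-and-loop step could be sketched as a small renderer; `page.php` and the id parameter come from the earlier answer, while the row shape is an assumption:

```javascript
// Turn an array of ticket rows (as fetched from the database) into the
// anchor list described above, one link per ticket.
function renderTicketLinks(tickets) {
  return tickets
    .map(t => `<a href="page.php?id=${t.id}">Ticket ${t.id}</a>`)
    .join('\n');
}
```

The same loop works whether the markup ends up as table rows or stacked divs; only the template string changes.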
For the simple PHP generation of links, @Ibu has a good example.
EDIT:
For more information about implementing AJAX, this page has a good example. I would recommend using a framework like jQuery or MooTools to handle the AJAX because there are some inconsistencies between browsers.
EDIT:
From the comments you made, it sounds like you are not very familiar with how PHP works.
PHP is just a templating language with some programming language features. It is best used to generate pages on the fly with dynamic content from a database.
When you try to request a .php page, you are actually telling the server to execute the code in that file. When all the code is finished, the resulting document is given to the requesting browser. The result should be valid HTML if done correctly.
On one of my pages, users queue up search terms to be queried from a 3rd-party API. As they're building the queue, my site is making the queries in the background (through AJAX) so I can cache the responses, saving them time when they submit. I store a session variable $_SESSION['isloading'] as true while the background queries are running, and it's false when they're done.
When they submit, the results page waits for $_SESSION['isloading'] to be false before querying the cache for results. Meanwhile they're shown a progress wheel.
Is there a name for this technique of using a session to locally "lock" a user before proceeding to the next step in the code? I came up with this approach on my own and was wondering if it is a common (or good) solution to this problem, and is used elsewhere.
Putting it in $_SESSION is a wasted effort. Been there, done that, and it didn't work out.
You will be much better off providing your "search query string" as a $_GET variable for your XHR (marketing people call it "Ajax").
Off the top of my head, this sounds a little similar to the way some older forum software performs forum searches in the background, and the visible page does a repeated refresh until the background search is complete.
I don't think there's a name for it; I'm also not entirely convinced that it's a great solution. As stevecomrie pointed out, you're going to run into difficulties with concurrency (unless the session variable's name is unique per search query).
I'd instead recommend an XmlHttpRequest (as teresko points out, it's not really called "AJAX", ugh!) and you can handle the "waiting" quite simply with Javascript.
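The "waiting" half could look like the sketch below: poll a small status endpoint until the background work reports done, then fetch the cached results. The endpoint and its JSON shape are assumptions, and the status check is injected so any transport works:

```javascript
// Poll until checkStatus() resolves true; gives up after maxTries so the
// progress wheel cannot spin forever if a background query hangs.
async function waitUntilDone(checkStatus, intervalMs = 500, maxTries = 60) {
  for (let i = 0; i < maxTries; i++) {
    if (await checkStatus()) {
      return true; // background queries finished; safe to load results
    }
    await new Promise(resolve => setTimeout(resolve, intervalMs));
  }
  return false; // timed out: show an error instead of the progress wheel
}

// Usage, roughly:
// waitUntilDone(() => fetch('status.php').then(r => r.json()).then(s => s.done))
//   .then(done => { if (done) loadResults(); });
```

The explicit retry cap is the important design choice: without it, a stuck background job leaves the user staring at the wheel indefinitely.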
I asked about this on IRC (hat tip to ##php on freenode), and they suggested I just make the search form and search results one page. Then, when they're done entering their searches, the view would change rather than submitting to the next page. This would remove the necessity of keeping track of an 'isloading' state. This seems like a better approach to me; are there any problems with it?
I am hitting a lot of different sites to get a list of information and I want to display this information as I get it. Right now I am using a Smarty Template and what I would like to do is:
Pseudo code:
{foreach $page}
$smarty_var = use AJAX to call a PHP function
Render out a new table row on the fly w/ the newly assigned var
<tr><td>{$smarty_var}</td></tr>
{/foreach}
I don't know much about AJAX; I used it a long time ago, and it was similar to this, but not quite, since there was user action involved. No, I don't have a JS framework in place. Am I way off here on how this should go? Basically I want to display a table row as data becomes available; each table row will be a request to get the data from another site.
Sure, I will tell you about what I am trying to do: http://bookscouter.com/prices.php?isbn=0132184745+&x=19&y=6
If you click on the 'Click to view prices from all 43 links' at the bottom of that page you will see. I am using cURL to get all the pages I want a price from. Then for each page I want to get the price. So each page is going to fire off a function that runs some code like this:
function parseTBDOTpageNew($page, $isbn)
{
    // Split on <table> tags, then on the <td> cells of the second table.
    $first_cut  = preg_split('/<table[^>]*>/', $page);
    $second_cut = preg_split('/<td[^>]*>/', $first_cut[2]);

    // The vendor is not buying this book at all.
    if (strpos($second_cut[4], "not currently buying this book") !== false) {
        return "\$0.00";
    }

    // The price sits inside a <b> tag of the ninth cell.
    $third_cut = preg_split('/<b[^>]*>/', $second_cut[9]);
    $last_cut  = preg_split('/</', $third_cut[3]);

    return $last_cut[0];
}
This function is called from another function, which puts the price returned from the function above, the name of the company, and a link into an array, which is added to another, bigger array that is sent to Smarty. Instead of doing that, I just want to get the first array that is returned with the individual information and add the values into a table row on the fly.
I will take your advice on jQuery. What I have started is an onload function that receives the $pages to be parsed, and I was just in the middle of writing: for each page, get the info and spit some HTML with the info onto the page.
Also, the function that calls the function to get the price is in a PHP file, so I need the request to hit a function within a PHP file and NOT just call file.php?param1=foo; I need it to actually hit the function in the file. I have jQuery in place, and now I'm just trying to figure it out and get it to do what I need, ugh. I am searching; any help would be appreciated.
No I don't have a JS Framework in place
Fix that first. You don't want to juggle XMLHTTPRequests yourself. jQuery is SO's canonical JS library, and is pretty nifty.
Basically I want to display a table row as data comes available, each table row will be a request to get the data from another site.
How many rows will you be dealing with? Do they all have to be loaded asynchronously?
Let's tackle this in a braindead, straightforward way. Create a script that does nothing more than:
Take a site ID and fetch data from the corresponding URL
Render that data to some data transport format, either HTML or JSON.
Then it's a simple matter of making the page that the user gets, which will contain JavaScript code that makes the ajax calls to the data fetcher, then either shoves the HTML into the page directly, or transforms the data into HTML and then shoves that into the page.
You'll note that at no point is Smarty really involved. ;)
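The client half of that plan could look like this sketch: one request per site, appending a row as each answer arrives. The endpoint name `fetch_price.php` and the JSON field names are assumptions:

```javascript
// Build one table row from one fetcher response.
function buildRow(result) {
  return `<tr><td>${result.company}</td><td>${result.price}</td></tr>`;
}

// One ajax call per site id, appending each row as its answer arrives:
// siteIds.forEach(async id => {
//   const result = await (await fetch(`fetch_price.php?site=${id}`)).json();
//   tbody.insertAdjacentHTML('beforeend', buildRow(result));
// });
```

Because each request resolves independently, rows appear in whatever order the vendor sites respond, which is exactly the "render as data comes available" behaviour the question asks for.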
This solution is highly impractical for anything more than a trivial number of sites to be polled asynchronously. If you need rows for dozens or hundreds of sites, that means each client is going to need to make dozens or hundreds of requests to your site for every single normal pageview. This is going to slaughter your server if more than one or two people load the page at once.
Can you tell us more about what you're doing, and what you're trying to accomplish? There are lots of ways to mitigate this problem, but they all depend on what you're doing.
Update for your question edit.
First, please consider using an actual HTML parser instead of regular expressions. The DOM is very powerful and you can target specific elements using XPath.
Instead of doing that, I just want to get the first array that is returned with individual information and add the values into a table row on the fly.
So, here's the ultimate problem. You want to do something asynchronously. PHP does not have a built-in generalized way to perform asynchronous tasks. There are a few ways to deal with this problem.
The first is as I've described above. Instead of doing any of the curl requests on page load, you farm the work out to the end user, and have the end user's browser make requests to your scraping script one by one, filling in the results.
The second is to use an asynchronous work queue, like Gearman. It has excellent PHP support via a PECL extension. You'd write one or more workers that can accept requests, and keep a pool of them running at all times. The larger the pool, the more things you can do at once. Once all of the data has returned, you can throw the complete set of data at your template engine, and call it good.
You can even combine the two, having the user make only one or two or three extra requests via ajax to fetch part of the returned data. Heck, you can even kick off the jobs in the background and return the page immediately, then request the results of the background jobs later via ajax.
Regardless of which way you handle it, you have a giant, huge problem. You're scraping someone's site. You may well be scraping someone's site very often. Not everyone is OK with this. You should seriously consider caching results aggressively, or even checking with each of the vendors to see if they have an API or data export that you can query against instead.