Ok what I am trying to make is a system that supports tickets. Tickets have all kinds of info on a specific job. How can I make a ticket menu with like links to tickets which contain all the info I need. For instance I click on ticket number 777 so it I have the php?id=777 in the url.
I need this page to constantly look for new tickets.
What you need to do is put the ticket id on an anchor:
<a href="page.php?id=777">Ticket 777</a>
<a href="page.php?id=778">Ticket 778</a>
On page.php you can then access the ticket number through the $_GET superglobal:
<?php
$ticket = isset($_GET['id']) ? (int) $_GET['id'] : 0;
// now you have the ticket number in the variable (cast to int for safety)
?>
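From there you would typically use the id to pull the ticket's details out of the database. A minimal sketch, assuming a hypothetical tickets table and mysqli credentials (none of this is in the original question):

<?php
// page.php - sketch only; the table, columns and credentials are assumptions
$mysqli = new mysqli('localhost', 'user', 'password', 'ticketdb');

$id = isset($_GET['id']) ? (int) $_GET['id'] : 0;

// Prepared statement so the id cannot be used for SQL injection
$stmt = $mysqli->prepare('SELECT subject, description, status FROM tickets WHERE id = ?');
$stmt->bind_param('i', $id);
$stmt->execute();
$ticket = $stmt->get_result()->fetch_assoc();

if ($ticket) {
    echo '<h1>Ticket #' . $id . ': ' . htmlspecialchars($ticket['subject']) . '</h1>';
    echo '<p>' . htmlspecialchars($ticket['description']) . '</p>';
} else {
    echo 'Ticket not found.';
}
?>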
I need this page to constantly look for new tickets.
I'm not 100% sure what you're exactly looking for here, but there are two basic solutions:
AJAX to update the content without reloading the page
Reload the page every few minutes and have PHP handle it
The first is better for things that are constantly updated, like Twitter posts, news, stock tickers, etc., but it's a little overkill for something that updates relatively rarely (say, less often than every six minutes or so).
The tickets should be stored in some kind of database and then read out and looped over to create the table. This would probably make more sense in a list format (stacked divs) instead of a table, but then again, I don't know the specifics.
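As a rough sketch of that loop, assuming a hypothetical tickets table (adjust names and credentials to your schema):

<?php
// list.php - sketch only; table and column names are assumptions
$mysqli = new mysqli('localhost', 'user', 'password', 'ticketdb');

$result = $mysqli->query('SELECT id, subject FROM tickets ORDER BY id DESC');

echo '<div id="ticket-list">';
while ($row = $result->fetch_assoc()) {
    echo '<div class="ticket"><a href="page.php?id=' . (int) $row['id'] . '">Ticket '
       . (int) $row['id'] . ' - ' . htmlspecialchars($row['subject']) . '</a></div>';
}
echo '</div>';
?>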
For the simple PHP generation of links, @Ibu has a good example.
EDIT:
For more information about implementing AJAX, this page has a good example. I would recommend using a framework like jQuery or MooTools to handle the AJAX because there are some inconsistencies between browsers.
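For instance, with jQuery the polling part can be as small as this (the list.php script and the 30-second interval are assumptions, not something from the question):

// Re-fetch the ticket list fragment every 30 seconds without a full page reload
setInterval(function () {
    $('#ticket-list').load('list.php #ticket-list > *');
}, 30000);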
EDIT:
From the comments you made, it sounds like you are not very familiar with how PHP works.
PHP is just a templating language with some programming language features. It is best used to generate pages on the fly with dynamic content from a database.
When you try to request a .php page, you are actually telling the server to execute the code in that file. When all the code is finished, the resulting document is given to the requesting browser. The result should be valid HTML if done correctly.
I am trying to show the list of online users in my application. Let me explain my requirement.
I have a MySQL table that stores a list of usernames and their status mode (either 1 or 0). I have two PHP pages: one lists all the users' names and their status modes, and the second is for editing a user's mode from 1 to 0 and vice versa.
Now I open these two pages on different systems. If I change the status of a user on the edit page from one system, the change should automatically be reflected on the other system where the listing page is open, showing the updated record, and this should happen without refreshing the listing page, just like GTalk chat users.
I am not asking for code, but please help me understand how to proceed. Obviously a cron job is one solution; please suggest another.
Thanks in advance.
Well, cron jobs are in fact not what you need.
With cron jobs you can schedule a task, but they are server side and always run on an interval. What you want is a client-side refresh when new info is found.
What you need is polling or Comet.
With the first, polling, the client executes a script every x seconds and checks whether there is new info (a waste of resources, in my opinion).
Comet is nowadays the better solution, but it is often hard to implement. I used Pusher for this kind of thing. You can push messages to (all) connected clients to say there is new info; they then update themselves, or the new info is sent along with the message.
To achieve something like this, you should use JavaScript and AJAX on the client side.
Give the XMLHttpRequest a try. To make it easier you could use something like jQuery.
On the server side you could use JSON to transmit the data.
Read the data from the table and put it into an array, let's call it $users, where the keys are the names and the values are their modes (1 or 0).
Then use json_encode(ARRAY):
//Echo the results in json format
echo(json_encode($users));
Let's say, the users 'Frank', 'Susan' and 'George' are online and 'Isabell' and 'John' are offline. Then the script would result in an output similar to this:
{"Frank":1,"Susan":1,"George":1,"Isabell":0,"John":0}
Of course you need to put this, together with the code that loads the data, into another PHP script, maybe refresh.php.
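A rough sketch of such a refresh.php, assuming the table from the question is called users with username and mode columns (the exact names are guesses):

<?php
// refresh.php - sketch only; table, columns and credentials are assumptions
$mysqli = new mysqli('localhost', 'user', 'password', 'mydb');

$users = array();
$result = $mysqli->query('SELECT username, mode FROM users');
while ($row = $result->fetch_assoc()) {
    $users[$row['username']] = (int) $row['mode'];
}

// Echo the results in JSON format
header('Content-Type: application/json');
echo json_encode($users);
?>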
And, to read the data from the script, add some JavaScript to your view page.
Use the XMLHttpRequest to request data from the script you just added.
Or, if you use jQuery, you can simply use $.getJSON("NameOfTheScriptYouJustWrote"), which hands an already parsed object to your callback.
Then use the returned data to update the list of users. And refresh it every 5-20 seconds.
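Put together, the client side could look roughly like this (the #user-list element and the 10-second interval are just placeholders):

// Poll refresh.php every 10 seconds and rebuild the user list
function refreshUsers() {
    $.getJSON('refresh.php', function (users) {
        var $list = $('#user-list').empty();
        $.each(users, function (name, mode) {
            $list.append($('<li>').text(name + (mode === 1 ? ' (online)' : ' (offline)')));
        });
    });
}

setInterval(refreshUsers, 10000);
refreshUsers(); // initial load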
And keep in mind that this is not an efficient approach at all and will not scale well if many clients are using your service.
Greetings everybody,
My problem is this: I have some custom-made statistics on my website where I log information about what users do (Google Analytics-like). I aggregate the information over a couple of months, but the tables I store it in have grown too large and have a negative impact on page loading. The flow is like this (in index.php, so it affects all pages):
1. Get the included files
2. Execute part of statistics queries
3. The actual page code
4. Execute the last part of statistics queries
To get rid of this problem I want to run those queries from <body onload="execQueries();"> or on document ready with JavaScript/AJAX.
How can I safely and securely run those queries using AJAX so that they cannot be abused by a client with good knowledge of JavaScript/AJAX? If I simply expose that JS function, it can be called at any time by a user with Firebug.
The solution I am thinking about uses $_SESSION: at the top of my index.php I store information about those queries (id, info) in the session, and in the script called by AJAX I check whether $_SESSION['query_info'] is set, execute the queries using the info stored there, and then call unset($_SESSION['query_info']). So if the AJAX endpoint is called again, that specific $_SESSION['query_info'] no longer exists and I do not touch the DB at all.
Do you think this is a secure solution or do you have other ideas? Anything viable is welcomed.
Thank you
Try putting your related JavaScript code into closures.
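For example, wrapping the call in an immediately invoked function keeps it out of the global scope, so it at least cannot be called by name from the Firebug console (this only hides it; the one-time $_SESSION check you describe is still what actually prevents replays). The stats.php endpoint here is just a placeholder name:

// Nothing defined in here leaks into the global scope
(function () {
    function execQueries() {
        // fire the AJAX request to the statistics script (placeholder URL)
        var xhr = new XMLHttpRequest();
        xhr.open('POST', 'stats.php', true);
        xhr.send();
    }

    window.addEventListener('load', execQueries);
}());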
This is very hard to explain but I'm going to try.
We run a motor shop that has a QC program. The program was coded in Access 97 and it's time for an upgrade; we have elected to try a PHP/MySQL approach.
Right now the Access software has several pages to the form, and each box saves to the database live, so when you type something in you don't have to hit a save button or Next or anything, and when you come back it's there.
Also, the forms are driven by an auto-incremented job number that you can punch into a field at the top of the page; it queries the server and displays all the data in the form boxes so you can edit it.
I don't know how to even start this project. I got a working form and an insert.php page but I don't know how to go about the rest.
If I could get a pointer in the right direction that would be appreciated. Thanks!
You just want it to save automatically? You'll have to look into JavaScript, and more specifically AJAX. I recommend using the jQuery library. Basically, you're going to want to make an AJAX call every time your form field is modified, and that AJAX call will simply update one field in particular.
I understand you are likely very new to website design, so this might be complicated for you.
I would read through this W3Schools tutorial. After reading through that, I'd pay close attention to this tutorial.
Again, this is difficult for beginners. I'd recommend you continue to work at your script, and ask more specific questions here on StackOverflow as time goes on. Good luck!
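A very small sketch of the idea, assuming a hypothetical save.php endpoint and a job-number field (the names are made up for illustration):

// Whenever a form field changes, send just that field to the server
$('#qc-form :input').on('change', function () {
    $.post('save.php', {
        job_no: $('#job_no').val(), // which record to update
        field:  this.name,          // which column changed
        value:  $(this).val()       // the new value
    });
});

On the server, save.php would whitelist the allowed field names before building its UPDATE statement, so the client can never pick an arbitrary column.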
I have created a simple example here:
HTML/JS: shaquin.tk/experiments/ajax.html
PHP: shaquin.tk/experiments/qc.txt
Have a look at the source to see how it works (I also have some comments in my code); feel free to copy it and modify it for your own needs.
To sum up how it works:
When text is typed into a text box, a list of changed elements is updated.
Every updateInterval milliseconds (default 1000), the list is checked. (This helps reduce traffic and lag.) If anything has changed, the PHP file is called to update the database, and the list is cleared.
If an element loses focus and it has changed (e.g. copy/paste), the PHP file is called.
The PHP file sanitizes the input, checks for a valid job number, and updates the database (a rough server-side sketch follows after the references).
References:
AJAX XMLHttpRequest
setInterval
addEventListener
encodeURI
mysqli_connect
mysqli_query
mysqli_real_escape_string
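A rough server-side sketch of that last step, using the mysqli functions listed above (the table name, columns and whitelist are assumptions):

<?php
// save.php - sketch only; whitelist the editable columns so the field name
// coming from the client can never inject arbitrary SQL
$allowed = array('customer', 'motor_model', 'notes');

$link  = mysqli_connect('localhost', 'user', 'password', 'qcdb');
$job   = (int) $_POST['job_no'];
$field = $_POST['field'];
$value = mysqli_real_escape_string($link, $_POST['value']);

if ($job > 0 && in_array($field, $allowed, true)) {
    mysqli_query($link, "UPDATE jobs SET `$field` = '$value' WHERE job_no = $job");
}
?>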
You'll need to submit the data as an ajax request. That way the data can be sent and returned without the page needing to be reloaded to update the information.
Good day all,
Basically what's I've got is a PHP based site tied to a MySQL database all on a local web server (nothing is being accessed from outside of the company). The index page displays an image resembling a bar chart. Employees of the company will be entering data periodically which will update the image that appears on the index page. The index page will be displayed on a couple different screens throughout the company and I need that index page to refresh after someone alters the data in the database.
I've been messing around with various AJAX solutions, but as I don't know much about AJAX I'm having trouble adapting something to work the way I need. Here's the approach I was thinking about:
-- on the index.php run a JavaScript function every minute or so that gets a response from dataChanged.php
-- dataChanged.php will query the database and get a timestamp from one of the tables.
-- the script on the index.php will then compare the timestamp to the last time the page was refreshed (or some variable that stores such information) and refresh if the data is new.
I'm somewhat proficient in PHP, but am very limited with JavaScript (and thus AJAX).
Can someone get me pointed in the right direction?
Thanks!
What you want is the standard JavaScript function setInterval.
Have it execute an AJAX call every now and then to get new data. Use a standard library that handles this stuff, probably jQuery.
Depending on whether the chart generation is time consuming, I would go with different strategies. The preferred way would be to just regenerate the graph on each call, but if that is very time consuming I think your two-step solution works great.
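A bare-bones version of the timestamp check you describe might look like this (dataChanged.php is your own script; the one-minute interval is just an example):

// Ask dataChanged.php for the latest modification timestamp once a minute;
// reload the page only when it has changed since the last check
var lastStamp = null;

setInterval(function () {
    var xhr = new XMLHttpRequest();
    xhr.onreadystatechange = function () {
        if (xhr.readyState === 4 && xhr.status === 200) {
            var stamp = xhr.responseText;
            if (lastStamp !== null && stamp !== lastStamp) {
                window.location.reload();
            }
            lastStamp = stamp;
        }
    };
    xhr.open('GET', 'dataChanged.php', true);
    xhr.send();
}, 60000);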
Check out this link: http://blog.codebusters.pl/en/auto-refresh-content-after-changes-in-database-ajax.
There you can see how to refresh your page after a change in the database.
I am hitting a lot of different sites to get a list of information and I want to display this information as I get it. Right now I am using a Smarty Template and what I would like to do is:
Pseudo code:
{foreach $page}
$smarty_var = use AJAX to call a PHP function
Render out a new table row on the fly w/ the newly assigned var
<tr><td>{$smarty_var}</td></tr>
{/foreach}
I don't know much about AJAX; I used it a long time ago for something similar, but not quite the same, and there was user action involved. No, I don't have a JS framework in place. Am I way off here on how this should go? Basically I want to display a table row as data becomes available; each table row will require a request to get the data from another site.
Sure, I will tell you about what I am trying to do: http://bookscouter.com/prices.php?isbn=0132184745+&x=19&y=6
If you click on 'Click to view prices from all 43 links' at the bottom of that page you will see what I mean. I am using cURL to get all the pages I want a price from. Then for each page I want to get the price, so each page fires off a function that runs some code like this:
function parseTBDOTpageNew($page, $isbn)
{
    // Split the page on <table> tags, then on <td> tags, to reach the cell we need
    $first_cut  = preg_split('/<table[^>]*>/', $page);
    $second_cut = preg_split('/<td[^>]*>/', $first_cut[2]);

    // strstr() returns false when the needle is not found, so test against false
    if (strstr($second_cut[4], "not currently buying this book") !== false) {
        return "\$0.00";
    }

    // The price sits inside a <b> tag; take everything up to the next tag
    $third_cut = preg_split('/<b[^>]*>/', $second_cut[9]);
    $last_cut  = preg_split('/</', $third_cut[3]);
    return $last_cut[0];
}
This function is called from another function, which puts the price returned above, the name of the company, and a link into an array; that array is added to a bigger array that is sent to Smarty. Instead of doing that, I just want to take the first array that is returned, with the individual information, and add the values to a table row on the fly.
I will take your advice on jQuery. What I have started is an onload function that receives the $pages to be parsed, and I was just in the middle of writing: for each page, get the info and output some HTML with it on the page.
Also, the function that calls the price-parsing function is in a PHP file, so I need the request to hit a function within that PHP file and NOT just call file.php?param1=foo; I need it to actually hit the function in the file. I have jQuery in place, and now I'm just trying to figure it out and get it to do what I need, ugh. I am searching; any help would be appreciated.
No I don't have a JS Framework in place
Fix that first. You don't want to juggle XMLHTTPRequests yourself. jQuery is SO's canonical JS library, and is pretty nifty.
Basically I want to display a table row as data comes available, each table row will be a request to get the data from another site.
How many rows will you be dealing with? Do they all have to be loaded asynchronously?
Let's tackle this in a braindead, straightforward way. Create a script that does nothing more than:
Take a site ID and fetch data from the corresponding URL
Render that data to some data transport format, either HTML or JSON.
Then it's a simple matter of making the page that the user gets, which will contain Javascript code that makes the ajax calls to the data fetcher, then either shoves the HTML in the page directly, or transforms the data into HTML and then shoves that into the page.
You'll note that at no point is Smarty really involved. ;)
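A sketch of that fetcher script; the file name, the getPriceForSite() stub and the JSON shape are all invented for illustration:

<?php
// fetch_price.php - returns the price info for one site as JSON
// Stand-in for your existing cURL + parse code:
function getPriceForSite($siteId, $isbn)
{
    return array('name' => 'Example Vendor', 'price' => '$0.00', 'link' => '#');
}

$siteId = isset($_GET['site']) ? (int) $_GET['site'] : 0;
$isbn   = isset($_GET['isbn']) ? preg_replace('/[^0-9Xx]/', '', $_GET['isbn']) : '';

header('Content-Type: application/json');
echo json_encode(getPriceForSite($siteId, $isbn));
?>

On the page, the JavaScript then fires one request per site and appends a row as each answer arrives, something like $.getJSON('fetch_price.php', {site: id, isbn: isbn}, callback) inside a loop.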
This solution is highly impractical for anything more than a trivial number of sites to be polled asynchronously. If you need rows for dozens or hundreds of sites, that means each client is going to need to make dozens or hundreds of requests to your site for every single normal pageview. This is going to slaughter your server if more than one or two people load the page at once.
Can you tell us more about what you're doing, and what you're trying to accomplish? There are lots of ways to mitigate this problem, but they all depend on what you're doing.
Update for your question edit.
First, please consider using an actual HTML parser instead of regular expressions. The DOM is very powerful and you can target specific elements using XPath.
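For instance, the 'not currently buying' check and the price lookup could be expressed with DOMDocument and DOMXPath instead of preg_split; the XPath expression here is illustrative, not tuned to the real page:

<?php
// $page holds the HTML fetched with cURL
$dom = new DOMDocument();
libxml_use_internal_errors(true); // real-world HTML is rarely valid, silence the warnings
$dom->loadHTML($page);
$xpath = new DOMXPath($dom);

// e.g. "the <b> inside the second table's ninth cell" - adjust to the actual page
$nodes = $xpath->query('(//table)[2]//td[9]//b');
$price = $nodes->length ? trim($nodes->item(0)->textContent) : '$0.00';
?>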
Instead of doing that, I just want to get the first array that is returned with individual information and add the values into a table row on the fly.
So, here's the ultimate problem. You want to do something asynchronously. PHP does not have a built-in generalized way to perform asynchronous tasks. There are a few ways to deal with this problem.
The first is as I've described above. Instead of doing any of the curl requests on page load, you farm the work out to the end user, and have the end user's browser make requests to your scraping script one by one, filling in the results.
The second is to use an asynchronous work queue, like Gearman. It has excellent PHP support via a PECL extension. You'd write one or more workers that can accept requests, and keep a pool of them running at all times. The larger the pool, the more things you can do at once. Once all of the data has returned, you can throw the complete set of data at your template engine, and call it good.
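With the pecl/gearman extension the shape of it is roughly this; the 'fetch_price' task name and the workload format are just examples:

<?php
// worker.php - run one or more of these in the background
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('fetch_price', function (GearmanJob $job) {
    $req = json_decode($job->workload(), true);
    // ... cURL + parse the vendor page here ...
    return json_encode(array('site' => $req['site'], 'price' => '$0.00'));
});
while ($worker->work());
?>

<?php
// client side: queue one job per site and collect the result
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$result = $client->doNormal('fetch_price', json_encode(array('site' => 7, 'isbn' => '0132184745')));
?>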
You can even combine the two, having the user make only one or two or three extra requests via ajax to fetch part of the returned data. Heck, you can even kick off the jobs in the background and return the page immediately, then request the results of the background jobs later via ajax.
Regardless of which way you handle it, you have a giant, huge problem. You're scraping someone's site. You may well be scraping someone's site very often. Not everyone is OK with this. You should seriously consider caching results aggressively, or even checking with each of the vendors to see if they have an API or data export that you can query against instead.