I am hitting a lot of different sites to gather information, and I want to display that information as it arrives. Right now I am using a Smarty template, and what I would like to do is:
Pseudocode:
{foreach $pages as $page}
    $smarty_var = use AJAX to call a PHP function
    render a new table row on the fly with the newly assigned variable
    <tr><td>{$smarty_var}</td></tr>
{/foreach}
I don't know much about AJAX; I used it a long time ago, and it was similar to this, but not quite: there was user action involved. No, I don't have a JS framework in place. Am I way off here on how this should go? Basically, I want to display a table row as data becomes available; each table row will be a request to get the data from another site.
Sure, I will tell you what I am trying to do: http://bookscouter.com/prices.php?isbn=0132184745+&x=19&y=6
If you click on 'Click to view prices from all 43 links' at the bottom of that page, you will see what I mean. I am using cURL to get all the pages I want a price from. Then, for each page, I want to get the price. So each page is going to fire off a function that runs some fun code like this:
function parseTBDOTpageNew($page, $isbn)
{
    // Split the page on <table> tags; the content we want is in the second chunk
    $first_cut = preg_split('/<table[^>]*>/', $page);
    // Split that chunk on <td> tags to isolate the individual cells
    $second_cut = preg_split('/<td[^>]*>/', $first_cut[2]);
    // If the vendor is not buying this book, report a zero price
    if (strpos($second_cut[4], "not currently buying this book") !== false)
    {
        return "\$0.00";
    }
    // Otherwise the price is wrapped in a <b> tag; split it out of the cell
    $third_cut = preg_split('/<b[^>]*>/', $second_cut[9]);
    $last_cut = preg_split('/</', $third_cut[3]);
    return $last_cut[0];
}
This function is called from another function, which puts the price it returns, the name of the company, and a link into an array; that array is added to a bigger array that is sent to Smarty. Instead of doing that, I just want to get the first array that is returned, with the individual information, and add the values into a table row on the fly.
I will take your advice on jQuery. What I have started is an onload function that receives the $pages to be parsed, and I was just in the middle of writing: for each page, get the info and spit out some HTML with that info on the page.
Also, the function that calls the price-parsing function lives in a PHP file, so I need the request to hit a function within that PHP file, and NOT just call file.php?param1=foo; I need it to actually hit the function in the file. I have jQuery in place; now I am just trying to figure it out and get it to do what I need, ugh. I am searching; any help would be appreciated.
No, I don't have a JS framework in place
Fix that first. You don't want to juggle XMLHttpRequest objects yourself. jQuery is SO's canonical JS library, and it is pretty nifty.
Basically, I want to display a table row as data becomes available; each table row will be a request to get the data from another site.
How many rows will you be dealing with? Do they all have to be loaded asynchronously?
Let's tackle this in a braindead, straightforward way. Create a script that does nothing more than:
Take a site ID and fetch data from the corresponding URL
Render that data to some data transport format, either HTML or JSON.
Then it's a simple matter of making the page that the user gets, which will contain JavaScript code that makes the AJAX calls to the data fetcher, then either shoves the returned HTML into the page directly, or transforms the returned data into HTML and shoves that into the page.
You'll note that at no point is Smarty really involved. ;)
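A minimal sketch of what that fetcher might look like, assuming a hypothetical fetch.php endpoint and your own get_site_url() and parse_price() helpers:

<?php
// fetch.php: fetch one site's page and return the parsed price as JSON.
// get_site_url() and parse_price() are stand-ins for your own code.
$siteId = isset($_GET['site']) ? (int)$_GET['site'] : 0;

$ch = curl_init(get_site_url($siteId));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
$page = curl_exec($ch);
curl_close($ch);

header('Content-Type: application/json');
echo json_encode(array('site' => $siteId, 'price' => parse_price($page)));

On the user-facing page, the JavaScript side would then be one call per site, appending a row as each answer arrives (siteIds and #prices are placeholders):

// Fire one request per site; rows appear in whatever order the answers come back
$.each(siteIds, function (i, siteId) {
    $.getJSON('fetch.php', { site: siteId }, function (data) {
        $('#prices').append('<tr><td>' + data.price + '</td></tr>');
    });
});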
This solution is highly impractical for anything more than a trivial number of sites to be polled asynchronously. If you need rows for dozens or hundreds of sites, that means each client is going to need to make dozens or hundreds of requests to your site for every single normal pageview. This is going to slaughter your server if more than one or two people load the page at once.
Can you tell us more about what you're doing, and what you're trying to accomplish? There are lots of ways to mitigate this problem, but they all depend on what you're doing.
Update for your question edit.
First, please consider using an actual HTML parser instead of regular expressions. The DOM is very powerful and you can target specific elements using XPath.
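For example, here is a rough sketch of the same extraction using DOMDocument and DOMXPath; the XPath query is a guess and would need adjusting to the page's real markup:

<?php
$dom = new DOMDocument();
libxml_use_internal_errors(true);   // scraped HTML is rarely well-formed
$dom->loadHTML($page);
libxml_clear_errors();

$xpath = new DOMXPath($dom);
// Guessed query: a <b> inside the cells of the second table on the page
$nodes = $xpath->query('//table[2]//td//b');
if ($nodes->length > 0) {
    $price = trim($nodes->item(0)->nodeValue);
}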
Instead of doing that, I just want to get the first array that is returned with individual information and add the values into a table row on the fly.
So, here's the ultimate problem. You want to do something asynchronously. PHP does not have a built-in generalized way to perform asynchronous tasks. There are a few ways to deal with this problem.
The first is as I've described above. Instead of doing any of the curl requests on page load, you farm the work out to the end user, and have the end user's browser make requests to your scraping script one by one, filling in the results.
The second is to use an asynchronous work queue, like Gearman. It has excellent PHP support via a PECL extension. You'd write one or more workers that can accept requests, and keep a pool of them running at all times. The larger the pool, the more things you can do at once. Once all of the data has returned, you can throw the complete set of data at your template engine, and call it good.
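A bare-bones sketch of that pattern using the pecl/gearman classes; the 'fetch_price' job name, the fetch_and_parse() helper, and the $vendors list are invented for illustration:

<?php
// worker.php: keep a pool of these running
$worker = new GearmanWorker();
$worker->addServer();   // defaults to localhost:4730
$worker->addFunction('fetch_price', function (GearmanJob $job) {
    return fetch_and_parse($job->workload());   // your curl + parse code
});
while ($worker->work());

<?php
// client.php: submit one job per vendor, collect results as they complete
$results = array();
$client = new GearmanClient();
$client->addServer();
$client->setCompleteCallback(function (GearmanTask $task) use (&$results) {
    $results[] = $task->data();
});
foreach ($vendors as $vendor) {
    $client->addTask('fetch_price', $vendor);
}
$client->runTasks();   // blocks until every task has finished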
You can even combine the two, having the user make only one or two or three extra requests via ajax to fetch part of the returned data. Heck, you can even kick off the jobs in the background and return the page immediately, then request the results of the background jobs later via ajax.
Regardless of which way you handle it, you have a giant, huge problem. You're scraping someone's site. You may well be scraping someone's site very often. Not everyone is OK with this. You should seriously consider caching results aggressively, or even checking with each of the vendors to see if they have an API or data export that you can query against instead.
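As a crude example of aggressive caching, here is a file-based sketch with a one-hour lifetime; the cache path and the scrape_price() helper are made up:

<?php
$cacheFile = '/tmp/price_' . md5($url) . '.cache';
if (file_exists($cacheFile) && time() - filemtime($cacheFile) < 3600) {
    $price = file_get_contents($cacheFile);   // serve the cached price
} else {
    $price = scrape_price($url);              // your curl + parse code
    file_put_contents($cacheFile, $price);
}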
I've just started learning PHP and have just covered $_POST/$_GET.
Now I want to know: what are the pros and cons of having PHP process the data from a form inside the same file, versus sending the data to another file (action="anotherfile")?
Logically, I would think that sending it to another file would increase the processing time, but is that true?
When I have the PHP script inside the same file, the page doesn't seem to reload when I hit the submit button (but the content changes). Or does it? If it does, wouldn't the only difference be that I would have to type the script for the menu (let's say you have the same menu on all pages) in both files? Which would lead to more coding/less space?
what are the pros and cons of having PHP process the data from a form inside the same file, versus sending the data to another file (action="anotherfile")?
You are conflating files and URLs.
By having the logic split between different files (and then included where appropriate) you separate concerns and make your code easier to manage.
By having a single URL be responsible for both displaying the form and processing the form data you don't end up in the awkward situation where the result of processing the form data requires that you redisplay the form with error messages in it. If you used two different URLs there you would need to either display the form on the processing URL (so you have two different URLs which display the form) or perform an HTTP redirect back to the original URL while somehow passing details of the errors to it.
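A minimal sketch of that single-URL pattern, assuming one required 'name' field and a hypothetical thanks.php to redirect to:

<?php
// form.php: one URL both displays and processes the form
$errors = array();
if ($_SERVER['REQUEST_METHOD'] === 'POST') {
    if (trim($_POST['name']) === '') {
        $errors[] = 'Name is required.';
    }
    if (!$errors) {
        // ... save the data somewhere ...
        header('Location: thanks.php');   // redirect so a refresh does not repost
        exit;
    }
}
foreach ($errors as $error) {
    echo '<p>' . htmlspecialchars($error) . '</p>';
}
?>
<form method="post" action="form.php">
    <input type="text" name="name">
    <input type="submit">
</form>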
Logically, I would think that sending it to another file would increase the processing time, but is that true?
No. It makes no difference on the time scales being dealt with.
When I have the PHP script inside the same file, the page doesn't seem to reload when I hit the submit button (but the content changes).
It does reload.
If it does, wouldn't the only difference be that I would have to type the script for the menu (let's say you have the same menu on all pages) in both files?
That's what includes are for.
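For instance, with the shared fragments pulled into their own files (header.php, menu.php, and footer.php are hypothetical names):

<?php include 'header.php'; ?>
<?php include 'menu.php'; // the same menu on every page, written once ?>
<p>Page content goes here.</p>
<?php include 'footer.php'; ?>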
In any language we always try to write clean code. That's why we follow MVC.
Logically I would think that sending it to another file would increase the processing time, but is that true? I think no.
When we send data to another page, and at the top of that page we simply echo the POST data and exit, you will see that it takes no noticeable time. The time is taken when we redirect to or load some HTML page after that.
It does not matter where we send the data (the same page or another page); what matters is what is loaded after that.
There is no difference in speed.
Whenever you post the content of your form with a standard submit, the data is sent to the server and a response (after processing) is downloaded.
The only difference is in the organization of your code. The logic that draws the template of the page (the menu or other fixed parts) should be stored in a file that you can include separately or call through a function.
It is also true that when you post your data, you do it for some reason, to register a user for example. It is good practice for the PHP file that handles user registration to do exactly that and output the messages, rather than other unrelated work.
If your file has logic switches that make it output either an empty form or a registration message based on the presence of POST or GET variables, you will notice that as you scale to more complex tasks this adds complexity and makes code maintenance harder.
I'll try to make sure I understand your question by restating it.
If you have a form (/form.php), and the "action" of that submit button leads you to a separate php page (/form_action.php), there is absolutely no difference in speed. Each HTTP request (form.php and form_action.php) is independent - "form_action.php" doesn't remember anything about "form.php" unless you pass that information through (as parameters). This is what people mean when they say that HTTP is stateless. It's worth learning about how HTTP works in general alongside the details of PHP.
If you have a PHP script which in turn includes other PHP scripts, there is a tiny performance impact - too small to measure in pretty much any case I've ever come across.
However, using includes allows you to separate your markup (the HTML) from the logic (the PHP). This is a really good thing if you are doing anything other than tinkering. It allows you to re-use functionality, it makes it much easier to change and maintain the code over time, and it helps you think through what you're trying to achieve.
There are many different ways people have solved the "how do I keep my code clean" puzzle; the current orthodoxy is "Model-View-Controller" (as @monty says). There are also PHP frameworks which make this a little easier to implement. Once you've got the basics of the language, you might want to look at Zend or TinyMVC (there are several others, each with their own benefits and drawbacks).
I have a PHP page with a simple form. One input text field & a button. Input text field accepts user queries & on button click an HTTP GET request is made to the server & the result has to be shown back in the same page containing the form. That's too simple to do. I can do this in two ways. One is AJAX & other one is the good old sodding form-submit method.
My question is simple- Which method should I use? Since both of the roads lead us to the same place, which one should I choose to travel?
First of all, let me talk about form submit method. I can use <?php echo $_SERVER['PHP_SELF'] ; ?> as the action of the form for submitting the values of my form to the same page. Once I store those values into some random variables, I can make a GET request & obtain the result & show it to the world. This method is easy to use. Happy Down Voting to all of you.
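For reference, that self-submitting pattern looks something like this; note that PHP_SELF should be escaped, since it can carry user-controlled data (run_query() is a made-up stand-in for the backend GET request):

<form method="get" action="<?php echo htmlspecialchars($_SERVER['PHP_SELF']); ?>">
    <input type="text" name="q">
    <input type="submit" value="Search">
</form>
<?php
if (isset($_GET['q'])) {
    // Make the backend request and show the result
    echo htmlspecialchars(run_query($_GET['q']));
}
?>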
Or I can make a GET request using AJAX and jQuery or JavaScript or whatever you wish to use & obtain the same result as in the previous case. Output is same. Only the mode of execution is different.
So which one is better? Which one fetches result faster? And why should I use that? Is there any difference? GET, POST, PUT or whatever- it doesn't really matter. AJAX or form-submit?
There shouldn't be any significant, genuine speed difference between them.
The Ajax approach will load a smaller amount of data (since you aren't loading an entire HTML document), but by the time you take into account HTTP compression and the fact that (if your system is sensibly configured) your dependencies (images, scripts, stylesheets, etc.) will be cached, it won't be significantly smaller.
By using JavaScript to create a loading indicator and not refreshing the entire window in front of the user, you can create the illusion of a faster load time though. So if feeling faster was the only concern, then Ajax is the way forward.
Using JavaScript, however, is more complicated and slightly more prone to failure. The consequences of failure states are more severe because, unless your code detects them and does something about them, the user will see nothing: the request fails silently. For example, if a normal page load times out because the user is on a train and went through a tunnel, they'll see an error page provided by their browser suggesting that they refresh and try again. With Ajax, you need to write the error handling code yourself. This does give you more flexibility (such as allowing you to simply retry a few times), but the work isn't done for you.
The other consequence of using Ajax is that the address bar will not update automatically. This means that the results won't be bookmarkable or sharable unless you do something explicit to make that possible. The usual way to do that is pushState and friends, but again, it is more work.
You should also make the site work without JavaScript, so that if the JS doesn't run for any reason the site won't break completely. If you use pushState, then you have to do this anyway for the URLs you set the address bar to point to to be useful.
The short answer: Use a regular form submission, then consider layering JavaScript over the top if you think it will give your visitors a worthwhile benefit.
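A sketch of that layering with jQuery: without JavaScript the form submits normally, and with it the submit is intercepted ('search.php', '#search-form', and '#result' are placeholders):

$('#search-form').on('submit', function (e) {
    e.preventDefault();   // only happens when JS is actually running
    $('#result').text('Loading...');
    $.ajax({
        url: 'search.php',
        data: $(this).serialize(),
        success: function (html) {
            $('#result').html(html);
        },
        error: function () {
            // With Ajax you have to handle failures yourself
            $('#result').text('Something went wrong; please try again.');
        }
    });
});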
I would stick to an Ajax request when possible.
This is because you then don't have to load every single item on the page again (all the images, the menu, and so on). You can just send the relevant HTML back, and jQuery can place it inside the relevant holder.
But that is just my humble opinion...
If you have to retrieve simple data from the server without reloading the page, my advice is to use jQuery's .get or .post.
jQuery also provides a very large API that allows you to reduce your programming time.
http://api.jquery.com/
Obviously the execution time increases, but in my experience the user can't feel the difference with a simple AJAX request.
So, in my opinion, if jQuery allows you to obtain the results, it is the best solution, because it halves your working time!
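For example, a minimal .get call (the URL, the q parameter, and the element IDs are placeholders):

// Fetch the result for the query and drop it into the page
$.get('search.php', { q: $('#query').val() }, function (response) {
    $('#output').html(response);
});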
See the edited answer; it may help you.
I think that AJAX should be used for display updates, while form submissions should be done via a page reload. Reasoning?
When submitting forms, you are telling the application to do something. Users tend to want to feel that it was done. When a page doesn't reload, users are often left wondering "Did that work?". Then they have to check to make sure what they did was right.
But when you are displaying a table or something, and the user says "display X data... now X1 data", for instance, they aren't "doing" something (creating new entities, sending emails, etc.). So AJAX can provide a nice user interface in this case. Page reloads would be annoying here.
In conclusion, I think form submission should be done via page reloads (let the user see it working), whereas display updates should use AJAX (prevent annoying page reloads).
Of course, this is a preference thing. Some of my company's applications use AJAX all over. But those are the applications that are the most difficult to maintain and debug. ;)
OK, what I am trying to make is a system that supports tickets. Tickets have all kinds of info on a specific job. How can I make a ticket menu with links to tickets which contain all the info I need? For instance, I click on ticket number 777 and I end up with page.php?id=777 in the URL.
I need this page to constantly look for new tickets.
What you need to do is add the correct ticket ID to an anchor:
<a href="page.php?id=777">Ticket 777</a>
<a href="page.php?id=778">Ticket 778</a>
On page.php you can access the ticket number using the $_GET superglobal:
<?php
// Cast to int so the ticket number is safe to use in a query
$ticket = isset($_GET['id']) ? (int)$_GET['id'] : 0;
// now you have the ticket number in the variable
?>
I need this page to constantly look for new tickets.
I'm not 100% sure what exactly you're looking for, but there are two basic solutions:
AJAX to update the content without reloading the page
Reload the page every few minutes and have PHP handle it
The first is better for things that are constantly updated, like Twitter posts, news, stock-tickers, etc, but it's a little overkill for something that updates relatively rarely (takes longer than 6 minutes or so).
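If you do want the AJAX approach, a simple polling sketch might look like this ('tickets.php' and '#ticket-list' are placeholders):

// Ask the server for the latest ticket list every 60 seconds
setInterval(function () {
    $.get('tickets.php', function (html) {
        $('#ticket-list').html(html);
    });
}, 60000);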
The tickets should be stored in some kind of database and then read out and looped over to create the table. This would probably make more sense in a list format (stacked divs) instead of a table, but then again, I don't know the specifics.
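In rough terms, reading the tickets out and looping over them might look like this (the database credentials, table, and columns are invented):

<?php
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$rows = $pdo->query('SELECT id, title FROM tickets ORDER BY id DESC');
foreach ($rows as $row) {
    $id    = (int)$row['id'];
    $title = htmlspecialchars($row['title']);
    // One stacked div per ticket, each linking to its detail page
    echo "<div><a href=\"page.php?id=$id\">Ticket $id: $title</a></div>\n";
}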
For the simple PHP generation of links, @Ibu has a good example.
EDIT:
For more information about implementing AJAX, this page has a good example. I would recommend using a framework like jQuery or MooTools to handle the AJAX because there are some inconsistencies between browsers.
EDIT:
From the comments you made, it sounds like you are not very familiar with how PHP works.
PHP is just a templating language with some programming language features. It is best used to generate pages on the fly with dynamic content from a database.
When you try to request a .php page, you are actually telling the server to execute the code in that file. When all the code is finished, the resulting document is given to the requesting browser. The result should be valid HTML if done correctly.
I'm having to interact with the Facebook API for this project, which I find to be actually a bit slower than I expected. Because of this, I'm having to do something which I find rather unorthodox: I need to load the content Facebook provides back in my PHP script AS IT LOADS from Facebook. Traditionally I've loaded content into a div tag on the success of the script; however, I need to load the content as it appears. It would be absolutely unacceptable to have a client wait nearly a minute for Facebook to load an album and all its comments before displaying anything. Hopefully I'm not being too vague; I'm not here to ask for code, but I've tried just about everything I can think of. Is this a simple concept I'm missing? I feel as though this is easier than I'm making it.
I'm using jQuery AJAX as I find this easiest to work with. Any comments and/or help would be greatly appreciated.
The root of your problem is that jQuery's AJAX methods hook into the onreadystatechange event and readyState variable. readyState is only set to 4 when the file is completely transferred, and therefore your events will only fire after the download is complete.
Accessing the data as it is being sent is not consistent across different browser families. Doing it this way is going to be incredibly complex and time-consuming. I would recommend first doing this a little differently, perhaps by preloading the relevant facebook data on your own server predictively. This can be compiled to a static page, and that can in turn be served to your users very quickly.
To get the data to your users faster, you'll need to work outside the box as well. There's a jQuery plugin discussed here (Does PHP flush work with jQuery's ajax?) that makes jQuery's ajax methods compatible with streamed output. Good luck.
The problem seems to stem from the fact that you're getting too much data at once. I suppose you are talking about receiving content from ajax as it is printed out immediately, but it is possible this content is built and sent at once and you won't have access to the data until the entire parse is complete. If this is untrue, look into COMET. If it is true, the solution is to put a limit on how much data you retrieve at once in an effort to reduce the parse time. For example, retrieve 5 photos in each request. Add those 5 photos to the DOM while you retrieve the next 5.
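A sketch of that chunked approach; the photos.php endpoint, its offset/limit parameters, and the JSON shape are all invented for illustration:

// Fetch 5 photos at a time; render each batch, then request the next
function loadPhotos(offset) {
    $.getJSON('photos.php', { offset: offset, limit: 5 }, function (photos) {
        $.each(photos, function (i, photo) {
            $('#album').append('<img src="' + photo.url + '">');
        });
        if (photos.length === 5) {
            loadPhotos(offset + 5);   // a short batch means we have reached the end
        }
    });
}
loadPhotos(0);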
Instead of putting whatever code you want inside the callback, just put it after the callback.
For example:
$("#div").load("facebook...", function() {
//do stuff
});
//put the stuff you want to load at the same time, here
I have a web app where I need to change a drop down list dynamically depending on another drop down list.
I have two options:
Get all the data beforehand with PHP and "manage" it later with JavaScript.
Or get the data the user wants through AJAX.
The thing is, that the page loads with all the data by default and the user can later select a sub category to narrow the drop downs.
Which of the two options are better (faster, less resource intensive)?
The less resource intensive option is clearly AJAX since you're only transferring the required information and no more.
However, AJAX can make the page less responsive if the latency is high for the client (having to wait for connections to fetch data between drop-down choices).
So: Load everything up front if latency is a bigger issue, and use AJAX if bandwidth is more of an issue.
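If you take the AJAX route, the wiring might look roughly like this (categories.php, the JSON shape, and the element IDs are placeholders):

// When the first drop-down changes, fetch the matching options for the second
$('#category').on('change', function () {
    $.getJSON('categories.php', { parent: $(this).val() }, function (options) {
        var $sub = $('#subcategory').empty();
        $.each(options, function (i, opt) {
            $sub.append($('<option>').val(opt.id).text(opt.name));
        });
    });
});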
It depends on your main goal:
1.
With AJAX you'll be able to get the data you want without a page refresh, fetching it as needed, so your app will run faster...
It will also allow you to have a single block of code in an independent file to be "called by AJAX" when needed, thus using that code across your app without loading it constantly!
2.
With PHP, you will have to prepare the data beforehand, thus writing a little more code and making your app slower...
The performance difference is nothing a user will notice, unless we are talking about a large amount of data.
In conclusion, AJAX is the best way in terms of performance and code effectiveness!
PS: Personal opinion, of course!
If there is a considerable number of possible select options, I would use AJAX to get them dynamically. If you only have a very small set of select options, it would be worth considering embedding them in the page. Embedding in the page means no latency, and a snappier interface.
However, as stated previously, dynamic retrievals are very useful if you have a large set of options, or if the options are subject to changing dynamically.
As with any ajax request, remember to display some form of visual feedback while the request is underway.
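For instance, a global spinner using jQuery's AJAX events (the #spinner element is a placeholder):

// Show a spinner whenever any AJAX request starts; hide it when all have finished
$(document).ajaxStart(function () { $('#spinner').show(); });
$(document).ajaxStop(function () { $('#spinner').hide(); });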