I'm trying to make a "status monitor" for our small network. After the page loads, I ping every IP address I have added, and that works fine. But I would like to repeat this ping every X minutes without reloading the whole page.
I can do it if I reload the page with a header refresh, but I would like to do this without a reload.
I think I have to do this with AJAX, but I don't know how.
Thank you
I would strongly suggest you have a look at Nagios or something similar:
1) you don't need to have a web page constantly open to detect problems
2) it can automatically verify and escalate issues
3) there are lots of probes available out of the box which can be used to measure all sorts of things - not just ping times
4) responding to a ping is not the same thing as working
5) it automatically collates stats to identify patterns of issues
6) it also provides SLA type reporting
7) Nagios is simple enough that even I can understand it
8) it's what I chose after a lot of research into a replacement for a system similar to the one you are suggesting.
HTH
C.
If the ping check is the entire code of the page, I suggest setting up a cron job.
And if you want to use AJAX (i.e. jQuery AJAX; there is a plugin called jQuery Timers), use it to send an AJAX request to the page with the code you want to run.
http://plugins.jquery.com/project/timers
check this out
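For illustration, here is a minimal sketch of that idea using plain setInterval with jQuery's AJAX helper (ping.php and #results are placeholder names for your ping script and the element that shows the results):

    // Re-run the ping page in the background and refresh the status display.
    $(function () {
      function checkHosts() {
        $.get('ping.php', function (html) {
          $('#results').html(html);            // replace the status list with the fresh results
        });
      }
      checkHosts();                            // run once right after the page loads
      setInterval(checkHosts, 5 * 60 * 1000);  // then repeat every 5 minutes
    });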
I suggest you take a look at some of the "other-way-around" approaches, such as Comet; here is an interesting article covering basic usage with PHP.
This would put the implementation of the "ping" on your server instead of in the client.
Instead of setting a fixed interval, you could, for instance, push out updates at will, meaning you would get almost realtime status notifications instead of fixed-interval updates.
In web development, Comet is a neologism to describe a web application model in which a long-held HTTP request allows a web server to push data to a browser, without the browser explicitly requesting it. Comet is an umbrella term for multiple techniques for achieving this interaction. All these methods rely on features included by default in browsers, such as JavaScript, rather than on non-default plugins.
COMET (Wikipedia)
Why don't you try a cron?
I'm not sure exactly what you want to do here, but this quick tutorial shows you how to call a PHP file every second and update a div block with the results. It is quick and simple using jQuery.
Related
I have made a dating website with a one-to-one chat feature like Facebook's. When one user sends a message to another user, it shows up in their popup chat box. I have done this using AJAX, which I run at a fixed interval using JavaScript's setInterval function. But I think this process is not an optimal one. I don't want to make unnecessary requests to the server each time; rather, a request should only be triggered when there is a new message for that user. Is there another way to do it, or another protocol used by big sites like Facebook and Gmail?
You could do this using WebSockets, but that requires both a server implementation and a web browser that supports it.
Another technique is to use Long Polling, but again, this requires work on both the client and the server. The advantage is that this is a cross browser compatible technique.
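To give a feel for the browser side of the WebSocket option, here is a hedged sketch (the URL is a placeholder, you still need a WebSocket server, and appendToChatBox is a hypothetical UI function):

    var socket = new WebSocket('ws://example.com/chat');   // placeholder server address

    socket.onopen = function () {
      socket.send(JSON.stringify({ type: 'join', user: 'alice' }));  // e.g. announce the user
    };

    socket.onmessage = function (event) {
      var msg = JSON.parse(event.data);   // the server pushes a message only when one exists
      appendToChatBox(msg);               // hypothetical function that updates the popup chat box
    };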
I agree with Josh that WebSockets would be worth looking into; however, if you don't have access to the server, you could use something like Firebase for the back end.
https://www.firebase.com/index.html
Read up on Long Polling. It's what Facebook uses. Basically your client makes one AJAX call and nothing gets returned until there is data to push to it. I'm pretty sure it requires some custom server configuration, so if you're developing on shared hosting it isn't going to cut it. Long Polling would be the right, albeit more complicated, way of doing this if efficiency is what you want.
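On the client side, a long-polling loop can be as simple as the sketch below (messages.php is a placeholder endpoint that your server would hold open until there is something to return; showNewMessages is a hypothetical UI function):

    function poll() {
      $.ajax({
        url: 'messages.php',
        timeout: 30000,                 // allow the server up to 30 seconds to answer
        success: function (data) {
          showNewMessages(data);        // hypothetical UI update with the pushed data
        },
        complete: function () {
          poll();                       // re-open the connection whether it succeeded or timed out
        }
      });
    }
    poll();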
Server-Sent Events seems to be another option.
A chat example: http://motyar.blogspot.com.es/2012/01/simple-chat-application-with-html5.html
Documentation: https://developer.mozilla.org/en-US/docs/Server-sent_events
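A minimal client for the Server-Sent Events approach might look like this (events.php is a placeholder endpoint that emits text/event-stream data; appendToChatBox is hypothetical):

    var source = new EventSource('events.php');   // the browser keeps this connection open

    source.onmessage = function (event) {
      var msg = JSON.parse(event.data);   // one push from the server
      appendToChatBox(msg);               // hypothetical UI update
    };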
I'm building a prototype where on one page A I have a controller. Imagine a TV remote. Another page B, independent of page A -- it can be on a different screen/computer -- has some elements X, Y, and Z that are going to be animated by the remote on page A.
My first idea would be:
The remote on page A saves the desired action and sends it as JSON.
I create a listener on page B that reads the JSON to listen to what action to trigger.
I wonder if this is the right direction.
It is just for a prototype so it doesn't have to be a production-perfect idea.
You can use web sockets for this.
Let's assume you're using 2 computers, both pointed at the website, as this technique could work in both scenarios.
If it's just a prototype, you could just simply have your page B poll the server every 5 seconds to look for updates that were submitted by page A.
However, in reality, for a production app with thousands of users, this could consume lots of bandwidth and put a heavy load on your server. To compensate, you could increase the polling interval to 10 seconds, or 30 seconds, but in response to this change your users would experience delays while they wait for the browser to request an update from the server.
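For the prototype, the polling version could be as small as the sketch below (commands.php, the JSON shape, and triggerAnimation are all assumptions; page A would submit its latest action to the same endpoint):

    // Page B asks the server every 5 seconds whether page A has sent a new command.
    setInterval(function () {
      $.getJSON('commands.php', function (cmd) {
        if (cmd && cmd.action) {
          triggerAnimation(cmd.action);   // hypothetical function that animates X, Y or Z
        }
      });
    }, 5000);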
In a production app, many developers are turning to Comet as a solution. Comet is basically a term coined by Alex Russell for a technique that involves using the request/response cycle to simulate server push.
The general idea is that the browser makes a request to the server, and the server holds that connection open indefinitely. The connection remains open until another user posts an update to the server, in which case, the server then sends a response to the connected users.
The Dojo and Jetty teams have demonstrations showing that this technique can scale to 20,000 connections when using continuations.
While I think you can carry out your experiment just fine with a database and/or some session variables, if you want to learn more about Comet on PHP, check out How to Implement Comet with PHP. Good luck!
UPDATE:
I also wanted to say that you definitely have the right idea for how to conceptually think about your message passing with JSON:
I create a listener on page B that reads the JSON to listen to what action to trigger.
I really like how you are thinking about passing a message that then tells the page what action to trigger. If you think about it, you could reuse your message passing concept to call other commands so that you avoid reinventing the wheel when a new command comes along that you need to call. Regardless of whether you poll, use Comet, or use WebSockets, it's a great idea to think about abstractions and generic, reusable data transports.
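To make that concrete, here is one possible shape for such a reusable command message and a single dispatch point on page B (field names and handlers are just examples, not a prescribed format):

    var message = {
      command: 'move',                          // what page B should do
      target:  'Y',                             // which element to animate
      params:  { x: 120, y: 40, duration: 500 } // whatever the command needs
    };

    // Page B needs only one dispatch point, regardless of which transport delivers the message.
    function handleMessage(msg) {
      var handlers = {
        move:   function (p) { /* animate the target element using p */ },
        rotate: function (p) { /* rotate it, and so on */ }
      };
      if (handlers[msg.command]) handlers[msg.command](msg.params || {});
    }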
You could do this either with polling (having page B constantly poll for updates from the server) or use a server push technology like server sent events or websockets.
Yes, that would work. You could also just make it the same way you would make a vector line animation: send the "commands" for movement to a server and record them (in a database, a file, whatever); the client program can then request them and redraw the movement smoothly at any time and anywhere.
Using a cron job to execute page B every X units of time will let you check for the latest updated JSON (queried/returned output according to your logic) from page A. This way, you can use the newly updated JSON from page A and carry out your further tasks...
I've done web scraping before, but it was never this complex. I want to grab course information from a school website, but the way all the course information is displayed is a web scraper's nightmare.
First off, when you click the "Schedule of Classes" url, it directs you through several other pages first (I believe to set cookies and check other crap).
Then it finally loads a page with an iframe that apparently only likes to load when it's loaded from within the institution's webpage (i.e. arizona.edu).
From there the form submissions have to be made via buttons that don't actually reload the page but merely submit an AJAX query, which I think just manipulates the iframe.
This query is particularly hard for me to replicate. I've been using PHP and curl to simulate a browser visiting the initial page and gathering the proper cookies and such. But I think I have a problem with the headers my curl function is sending, because it never lets me execute any sort of query after the initial "search form" loads.
Any help would be awesome...
http://www.arizona.edu/students/registering-classes -> "Schedule of Classes"
Or just here:
http://schedule.arizona.edu/
If you need to scrape a site with heavy JS / AJAX usage, you need something more powerful than PHP ;)
First, it must be a full browser with the capability to execute JS, and second, there must be some API for automated browsing.
Assuming that you are a kid (who else would need to parse a school site), try Firefox with iMacros. If you are a more seasoned veteran, look towards Selenium.
I used to scrape a lot of pages with JS, iframes and all kinds of stuff like that. I used PhantomJS as a headless browser, which I later wrapped with the PhantomCurl wrapper. The wrapper is a Python script that can be run from the command line or imported as a module.
Are you sure you are allowed to scrape the site?
If yes, then they could just give you a simple REST api?
In the rare case when they would allow you to get to the data but would not provide an API, my advice would be to install some software to record your HTTP interaction with the website (maybe Wireshark, or some HTTP proxy), but it is important that you get all the details of the HTTP requests recorded. Once you have that, analyze it and try to replay it down to the last bit.
Among possible chores, it might be that at some point the server sends you generated JavaScript that needs to be executed by the client browser in order to get to the next step. In this case you would need to figure out how to parse the received JavaScript and work out how to move on.
It would also be a good idea not to fire all your HTTP requests in burst mode, but to put some random delays between them so that your traffic appears more "human" to the server.
But in the end you need to figure out whether all this is worth the trouble: almost any roadblock to scraping can be worked around, but it can get quite involved and time consuming.
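As a small illustration of the "replay with random delays" idea, here is a sketch in Node.js (18+, for the built-in fetch); the URLs and headers are placeholders standing in for whatever you recorded:

    const urls = ['https://example.edu/step1', 'https://example.edu/step2'];

    async function replay() {
      for (const url of urls) {
        const res = await fetch(url, { headers: { 'User-Agent': 'Mozilla/5.0' } });
        console.log(url, res.status);
        // wait 2-7 seconds before the next request so the traffic looks more "human"
        await new Promise(resolve => setTimeout(resolve, 2000 + Math.random() * 5000));
      }
    }

    replay();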
Any idea how to implement this (http://fluin.com/63) using MySQL+PHP+Javascript(mootools)?
In a nutshell, it's a realtime threaded conversational web app.
Update:
This uses http://www.ape-project.org/home.html
Any idea how to implement realtime stuff without AJAX push (ape)?
Install Firefox.
Install Web Development toolbar
Install Firebug
Install HttpFox
Read docs of above tools re how to use, what they can do.
Go to http://fluin.com/63. Use above tools to inspect.
Read up on Databases and data models, and MySQL.
Build your own.
Well, this depends on your definition of realtime, which, in its technical meaning, is simply impossible with public IP networks and the traditional TCP stack, since you have no control over timing.
Closer to the topic, though: to get any web page updated without direct user intervention, you'd have to use JavaScript to poll the server for changes since the last successful poll, and do this at certain intervals. In choosing these intervals you'll have to consider both the network/server load and the delay that is comfortable for the user.
The server, of course, will have to store the new data along with its timing information (creation timestamps are one way of doing it) to be able to distinguish content already delivered to the various clients.
As soon as the server reports new content, it is inserted into the DOM via JavaScript and the user sees the response.
This is a bit general, of course, but you should get the idea.
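A rough client-side sketch of that "changes since the last poll" idea (shown with jQuery for brevity, though the shape is the same in MooTools; updates.php, the response fields, and #thread are all assumptions):

    var lastPoll = 0;   // server-side timestamp of the last successful poll

    setInterval(function () {
      $.getJSON('updates.php', { since: lastPoll }, function (response) {
        lastPoll = response.now;                              // timestamp reported by the server
        $.each(response.items, function (i, item) {
          $('#thread').append('<p>' + item.text + '</p>');    // insert the new content into the DOM
        });
      });
    }, 5000);   // the interval is a trade-off between server load and perceived delay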
Isn't it like a shoutbox? Here's an example of one.
Doing this properly using PHP only is very hard. When you have 5 users you could use long-polling, but it will definitely not scale when you have, let's say, 1000 users.
Using comet with PHP?
The screencast(link) in my post shows how you could implement it, but it has a couple of flaws:
It touches the disk (disk is very slow compared to memory).
To make matters worse, it also polls the disk frequently (filemtime()).
Maybe phet (PHP) is able to scale. You should try that out.
To make it scale I think you need at least:
a good implementation of long-polling (at least long-polling; there are better transports) that can handle the load.
keep data in memory (much faster than disk) using something like Redis or memcached.
I would use:
node.js with the socket.io (video) module.
to keep data in memory I would use node_redis (video).
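A bare-bones sketch of the node.js/socket.io side could look like the following (exact API details vary between socket.io versions, and the event names are just examples):

    var server = require('http').createServer();
    var io = require('socket.io')(server);

    io.on('connection', function (socket) {
      socket.on('post', function (msg) {
        // broadcast the new post to every connected client; no client-side polling needed
        io.emit('post', msg);
      });
    });

    server.listen(3000);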
I have been working on a site that makes some pretty big use of AJAX and dynamic JavaScript on the front end and it's time to start stress testing. But how do you properly stress test something that requires clicking several links on the front-end? One way I was able to easily hit every page of the site quickly and repeatedly was to point a Google Mini at it. But that's not going to click links and then navigate Modal windows and things like that.
Edit - I should point out that the site is done in PHP5 and the JavaScript library used is jQuery. Not sure if this would make any difference but felt it might be useful to know.
JMeter is great at this. You may record your sessions and tweak them to your liking.
So-called 'ajax load testing' is a recurring subject on this site, and is often confused. So let's get it straight: There is really no difference between load testing a normal web page and load testing with ajax. It all boils down to discrete requests; they just happen to not be full page refreshes.
One thing to keep in mind is that there is a distinct difference between load testing the server that processes the requests (a load test) and the on-screen performance of the UI components being updated (how well your JavaScript performs).
Simple load test example (a bare-bones script version of this scenario is sketched after the list):
initial page load
login
navigate?
5-10 'ajax' requests (or whatever may fit your application usage pattern)
logout
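To underline that such a scenario is just a series of discrete requests, here is a toy script version (Node.js 18+ for the built-in fetch; URLs and payloads are placeholders, and a real tool like JMeter records these for you):

    async function oneVirtualUser() {
      await fetch('https://example.com/');                                  // initial page load
      await fetch('https://example.com/login', {                            // login
        method: 'POST',
        headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
        body: 'user=test&pass=test'
      });
      for (let i = 0; i < 5; i++) {
        await fetch('https://example.com/api/data?page=' + i);              // the "ajax" requests
      }
      await fetch('https://example.com/logout');                            // logout
    }

    // simulate 50 concurrent virtual users
    Promise.all(Array.from({ length: 50 }, oneVirtualUser))
      .then(function () { console.log('done'); });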
There are load testing tools that can support AJAX. For example, WebLoad
http://www.radview.com/solutions/ajax-load-testing.aspx
What you really want to stress test is the server's ability to handle the AJAX requests. Use a load tool that looks at the requests while "recording" the test, and then tune as appropriate. I have only used the VS Test Edition one, so I can't point you to a low-cost one.
I disagree with Nathan and Freddy to some degree. They are correct that "AJAX testing" is really no different in that HTTP requests are made. But it's not that simple. See my article on Ajaxian.com on Why Load Testing Ajax is Hard.
JMeter, Pylot, and The Grinder are all great tools for generating HTTP requests (I personally recommend Pylot). But at their core, they don't act as a browser and process JavaScript, meaning all they do is replay the traffic they saw at record time. If those AJAX requests were unique to that session, they may not be suitable/correct to replay in large volumes.
The fact is that as more logic is pushed down in to the browser, it becomes much more difficult (if not impossible) to properly simulate the traffic using traditional load testing tools.
In my article, I give a simple example of how difficult it becomes to test something like Google's home page when you want to query 1000's of different search terms (an important goal during load testing). To do it with JMeter/Pylot/Grinder you effectively end up re-writing parts of the AJAX code (in your case w/ jQuery) over again in the native language of the tool.
It gets even more complex if your goal is to measure the response time as perceived by the user (which is arguably the most important thing at the end of the day). For really complex applications that use Comet/"Reverse Ajax" (a technique that keeps open sockets for long periods of time), traditional load tools don't work at all.
My company, BrowserMob, provides a load testing service that uses Firefox browsers, powered by Selenium, to drive hundreds or thousands of real browsers, allowing you to measure and time the performance of visual elements as seen in the browser. We also support traditional virtual users (blind HTTP traffic) and a simulated browser (via HtmlUnit).
All that said, usually a mix of a service like BrowserMob plus traditional load testing is the right approach. That is, real browsers are great for a full-fidelity load test, but they will never be as economical as "virtual users", since they require 10-100X more RAM and CPU. See my recent blog post on whether to simulate or not to simulate virtual users.
Hope that helps!
You could use something like openSTA.
This allows a session with a web site to be recorded and then played back via a relatively simple script language.
You can also easily test web services and write your own scripts.
It allows you to put scripts together in a test in any way you want and configure the number of iterations, the number of users in each iteration, the ramp up time to introduce each new user and the delay between each iteration. Tests can also be scheduled in the future.
It's open source and free.
It produces a number of reports which can be saved to a spreadsheet. We then use a pivot table to easily analyse and graph the results.