I am having to switch my site over from ColdFusion to PHP, and I am noticing that some JS, fadeslideshow.js, isn't running too smoothly. First, I notice that some images aren't being centered as they should be, and the code seems to run a little choppy. Any clues for this novice?
PHP and ColdFusion are both server-side languages. JavaScript is client-side. So unless your JavaScript is waiting for an AJAX request to load, PHP/ColdFusion have absolutely no impact on your JavaScript's execution speed. Perhaps you're testing with a different browser than the one you were using before? That would explain the changes.
I am relatively new to node.js and socket.io. Currently I have a half-finished private web project, which runs with only PHP and a MySQL database on the server side. I decided to bring it to a more advanced level using socket.io for several features within the project.
So I read a lot about it and watched a whole bunch of tutorials. Also I found this and this during my research.
My question is whether that is still the common way to develop a web application.
More exactly: using both an AJAX request and a socket.emit on one event (like a form submit), for those events where it is necessary/wanted.
The background of this thought is the following: I have a whole bunch of calculations running in PHP right now, and the node.js server logically runs JavaScript. So I could either add a node.js server without changing anything in my AJAX requests, or rewrite everything I have so far in JS and use only a node.js server.
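That combined approach can be as small as this; a rough sketch, assuming jQuery and the socket.io client are loaded, with a made-up endpoint, event name, and selectors:

```javascript
// Sketch only: the form data still goes to the existing PHP endpoint via AJAX,
// and the same submit event also notifies the socket.io server so other
// clients can be updated in real time.
var socket = io('http://localhost:3000');

$('#comment-form').on('submit', function (e) {
  e.preventDefault();
  var payload = { text: $('#comment-text').val() };

  // 1) persist via the existing PHP backend
  $.post('/comments/save.php', payload, function (saved) {
    // 2) tell the node.js/socket.io server, which can broadcast it
    socket.emit('comment:created', saved);
  });
});
```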
But this leads to 3 more questions:
Which possibly runs faster on the server side: a calculation scripted in PHP or in JavaScript?
How do I use transactions on a node.js server while using MySQL?
And how great is the overhead of converting a PHP array to a JSON object, which you could avoid by using just the node.js server, where you don't need to convert anything?
JavaScript is executed on the client side, so you are limited by the user's hardware, whereas PHP is executed on your server. See this post for more info about the performance comparison.
I highly suggest you take a look at this pure node.js client, which will do the job perfectly in your case.
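Assuming that client is a MySQL driver such as the npm mysql module, transactions look roughly like this (connection details and table/column names are hypothetical):

```javascript
// Minimal transaction sketch with the npm "mysql" client.
const mysql = require('mysql');

const connection = mysql.createConnection({
  host: 'localhost',
  user: 'app',
  password: 'secret',
  database: 'mydb'
});

connection.beginTransaction(function (err) {
  if (err) throw err;

  connection.query('UPDATE accounts SET balance = balance - 10 WHERE id = ?', [1], function (err) {
    if (err) return connection.rollback(function () { throw err; });

    connection.query('UPDATE accounts SET balance = balance + 10 WHERE id = ?', [2], function (err) {
      if (err) return connection.rollback(function () { throw err; });

      connection.commit(function (err) {
        if (err) return connection.rollback(function () { throw err; });
        console.log('Transaction committed');
        connection.end();
      });
    });
  });
});
```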
PHP has many functions for working with JSON data (json_decode(), json_encode(), ...), but Node.js doesn't require JSON data to be converted at all. In the end, it really depends on your usage and how you plan to store and use that data.
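For illustration: on the node.js side a query result is already a JavaScript object, so the only serialization happens once at the HTTP boundary. The Express route and payload below are made up:

```javascript
// Hypothetical Express route: no json_encode()/json_decode() round trip needed;
// the object is serialized exactly once when the response is sent.
const express = require('express');
const app = express();

app.get('/stats', (req, res) => {
  const results = { users: 42, online: 7 }; // already a plain JS object
  res.json(results);                        // serialized at the HTTP boundary
});

app.listen(3000);
```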
I know that in PHP, once you make a request, the browser goes into a waiting mode until PHP does its work on the server and sends the results back. In the meantime, there is no live connection between the browser and the server while PHP is doing its work.
I was wondering if the same goes for JSP, or if connections are handled differently?
First of all JSP is not a programming language. It's just a technology.
I guess that your question is really about how they work, right (Java and PHP)? So to put it in a simple perspective and with a simple answer: the main difference is that Java is compiled to bytecode before it gets to the server, and that bytecode is then executed by the JVM when a request is made. With PHP, the code is interpreted (translated to bytecode) on the fly and executed by the server. In both cases, a response to your request is sent back to the browser.
In the middle, between your request and the response, there's JSP, which is a technology that makes it possible for you to write HTML with some dynamic scripting mixed in. It's much like a template (you can compare it to a PHP templating engine like Twig). You write it only in your views, and it gets parsed and compiled into Java Servlets.
I hope my answer makes you investigate a little bit further.
I've been working with sockets, generally in PHP, for a while. Currently I have a PHP client that connects to a chat server and outputs every piece of data sent from the server it's connected to.
To explain it in a bit more detail, I accomplished this using PHP's flush() function to write out each buffer waiting in the loop. The buffer reader is inside a while loop whose condition is the status of the connection socket. But this matters less.
Now to what I want to accomplish. I want to keep the socket handling on the server side and have the data from the server output to the client via AJAX/jQuery. So far, my research always points me to HTML5 WebSockets and node.js; however, I "have to" be really picky about this, because my minimal target users might be:
WinXP IE6 users (which already rules out even jQuery)
Users without Java/Flash installed
So I have to think about the possibilities here, which is why I can't use a Flash/Java backend or a new technology like WebSockets, and I don't want to handle server stuff in the client either. I really hate being stuck with old technology, but for this it's a must.
As I was searching around, I found this one, which is similar to my needs.
Is PHP socket a viable option for making PHP jQuery based chat?
And to quickly review the answers, they all point in one direction: PHP multi-processing and memory eating. I know this is a minus, but it's the best I can accept for now. Still, there'll be timeout disconnects for inactive connections after a certain delay, with extension of the delay if wanted. So I'm not too keen on this one.
Secondly, regarding the last answer pointing to the "Ajax Chat Application Tutorial": I reviewed it, but whoa, writing each line into an HTML file and re-including it each time? That is something I could do without using an extra file, but is it really necessary? Plus re-reading the file on the server side and re-importing the whole file into the document every single time, isn't that just worse for "both sides"?
Either way, that's about it. I wasn't able to come to a conclusion for a while, so here I am again. (:P) Waiting for your answers/suggestions/ideas; thanks in advance.
Regards.
There is server software available that specializes in such matters. It's called a push server/service. There's, for example, APE (http://www.ape-project.org/); according to their website, it's compatible with all web browsers and they even have a demo chat there. I'd suggest you go for that solution.
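For reference, the generic comet/long-polling pattern that push servers like APE implement looks roughly like this on the client side; a plain-XMLHttpRequest sketch (the endpoint, payload, and target element are made up), which is what makes it reachable even for IE6-era browsers without Flash or Java:

```javascript
// Long-polling sketch: the request stays open until the server has new messages,
// then we render them and immediately reconnect.
var lastId = 0;

function createXHR() {
  if (window.XMLHttpRequest) return new XMLHttpRequest();
  return new ActiveXObject('Microsoft.XMLHTTP'); // IE6 fallback
}

function appendMessage(msg) {
  var div = document.createElement('div');
  div.appendChild(document.createTextNode(msg.user + ': ' + msg.text));
  document.getElementById('chat').appendChild(div);
}

function poll() {
  var xhr = createXHR();
  xhr.open('GET', '/chat/poll.php?since=' + lastId, true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState !== 4) return;
    if (xhr.status === 200) {
      var messages = eval('(' + xhr.responseText + ')'); // no native JSON in IE6
      for (var i = 0; i < messages.length; i++) {
        lastId = messages[i].id;
        appendMessage(messages[i]);
      }
    }
    setTimeout(poll, 100); // reconnect right away, with a small pause
  };
  xhr.send(null);
}

poll();
```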
Recently, I've been involved in two projects.
The first one is built on PHP and the second in JavaScript (using http://nodejs.org/).
Well, I thought that since PHP depends mostly on the hosting provider, if the site works in one browser it should work in all the rest. Since JavaScript depends mostly on the browser, I should encounter more issues across different browsers.
What about the JavaScript part that handles things between the client and the server side?
Am I right? Or am I not considering something?
Server-side JavaScript does not run in the browser, so it won't have cross-browser issues.
If you're using server-side Javascript, it doesn't matter what browser the client is using. From the browser's perspective, there could just as well be a team of well-trained monkeys keying HTTP responses into a teletype somewhere -- all that matters to them is that you're returning data. It doesn't matter to the client what you're using.
Browser issues are in essence completely independent of PHP, period. After all, at the end of the day, when the script finishes, all you get is HTML output.
We need to create a web-based frontend for displaying some data. The problem is that the data needs to be updated about once a second.
For me as a web-developer the obvious solution is AJAX.
Unfortunately, this web frontend is meant to be displayed inside an embedded browser window which is expected to run constantly for months or even years. That is, months of work with no restart/refresh.
During testing we ran a proof-of-concept interface (which requested a simple set of data every 1.5 s) in Safari for over a month. During this period, the memory usage of Safari rose from ~30 MB to over 100 MB.
Thus we're worried about the stability of such a solution.
I'm wondering if you could recommend any other technique for this task, possibly with less overhead (when requesting simple sets of data, as in our case, I'm afraid the HTTP headers make up a very significant part of the traffic).
I would suggest looking into node.js and the now.js plugin, which allows for realtime updates via websockets. It even has support for older browsers, so if the browser does not support websockets, it will fall back to either a comet implementation, AJAX, or an iframe.
It's extremely easy to set up on a Linux environment, and there's ample documentation to get you started.
It works with JavaScript and runs on the Google V8 JavaScript engine, so if you've ever worked with OOP JavaScript, you should be able to pick it up relatively easily. A rough sketch of the push pattern follows after the links below.
LINKS:
http://nodejs.org/
http://nowjs.com/
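Not a literal now.js sample, but the underlying push pattern (now.js rides on socket.io) looks roughly like this; the one-second interval, event name, and payload are assumptions:

```javascript
// server.js — pushes a data snapshot to every connected client once per second.
// Assumes a socket.io 1.x+ style API; replace getLatestData() with the real source.
const http = require('http');
const socketio = require('socket.io');

const server = http.createServer();
const io = socketio(server);

function getLatestData() {
  return { value: Math.random(), ts: Date.now() }; // placeholder payload
}

setInterval(function () {
  io.emit('data', getLatestData()); // broadcast to all connected clients
}, 1000);

server.listen(3000);
```

On the client you would then just register a handler once, e.g. socket.on('data', render), instead of polling, which also removes the per-request HTTP header overhead mentioned in the question.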
How about Adobe AIR as a front-end? You can use Flash/Flex inside, which have decent garbage collectors, so long running shouldn't be a problem. AIR also allows you to write in XHTML and JavaScript, so it could be a good option if you're only familiar with those technologies.
PHP is not a good choice for this kind of request. Comet seems to be a good way to receive data from the server. You can use, for example, the excellent Tornado (Python) as a backend.
ActionScript lets you use TCP sockets, so you can write your own protocol for even better performance and use Boost.Asio (C++) or Netty (Java) as a scalable backend.
Maybe WebSockets? Instead of making an AJAX request every X seconds, the server pushes new data as it comes.
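If the embedded browser can be upgraded to something WebSocket-capable, the client side really is this small; the URL, message format, and target element are assumptions:

```javascript
// Browser-side WebSocket client: the server pushes a frame whenever the data
// changes, so after the initial handshake there is no per-request header overhead.
function connect() {
  var socket = new WebSocket('ws://localhost:8080/updates');

  socket.onmessage = function (event) {
    var data = JSON.parse(event.data);
    document.getElementById('readout').textContent = data.value;
  };

  socket.onclose = function () {
    setTimeout(connect, 5000); // reconnect if the connection ever drops
  };
}

connect();
```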
My personal favorite is PHP 4+, MySQL, and the Apache or lighttpd web server.
Though I also suggest Python.
I specialize in what you are mentioning. That said, will you actually be looking at the screen? If not, you could request the page using an HTTP socket or via a wget cronjob on a Linux box.
Yes, the HTTP headers are important; if you try to strip out the required ones, the web server will issue a "400 Bad Request" error.
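To give a feel for how small the required headers actually are, here is a complete HTTP/1.1 request sent over a raw TCP socket (host and path are placeholders); omit the Host header and most servers will indeed answer with 400 Bad Request:

```javascript
// Minimal HTTP/1.1 request using Node.js's "net" module; everything beyond
// the request line and the Host header is optional.
const net = require('net');

const socket = net.connect(80, 'example.com', function () {
  socket.write('GET /data HTTP/1.1\r\n' +
               'Host: example.com\r\n' +
               'Connection: close\r\n' +
               '\r\n');
});

socket.on('data', function (chunk) {
  process.stdout.write(chunk); // raw response, headers included
});
```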
Let me know what you decide, I have a lot to share :)
I suspect that the problem is not AJAX per se, but using a browser as such: I don't think any were made with constant running in mind, and I'm assuming that all the (re)loading processes will turn into some form of extra memory use in the end.
I think you would be best off consuming your data through something simple you design yourself. You can obviously produce it in the same place (a server, requestable via HTTP or whatever you like most), but you do not need a complete web browser if your primary goal is "a couple of years of uptime".
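A minimal sketch of that kind of "something simple": a small Node.js script that fetches the data every 1.5 seconds with no browser involved; the URL and what you do with the data are assumptions:

```javascript
// Polls an HTTP endpoint every 1.5 s without a browser. A process this small is
// much easier to keep alive (or restart automatically) for months than a browser.
const http = require('http');

function fetchOnce() {
  http.get('http://localhost/data.json', function (res) {
    var body = '';
    res.on('data', function (chunk) { body += chunk; });
    res.on('end', function () {
      var data = JSON.parse(body);
      console.log('latest value:', data.value); // replace with the real display logic
    });
  }).on('error', function (err) {
    console.error('request failed:', err.message);
  });
}

setInterval(fetchOnce, 1500);
```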