I am currently working on a site. I designed it as an API so that it can easily interface with mobile devices as well as keep me completely separate from front-end development. The intention was that the front-end designer would use JavaScript/jQuery to make API calls. The API returns JSON, so the front-end designer would format the content appropriately. I noticed that instead of using jQuery to obtain this data, he is using inline PHP to make the appropriate API calls using cURL to localhost and then echoing the JSON result and formatting it. Is this cause for concern, since the server is essentially requesting itself? A new process is spawned, the server has to process a request AND a response, etc. Is it better for the remote clients to resolve the API calls with jQuery, or to have the server cURL localhost and resolve them?
It sounds like the performance issue here would be that PHP might be getting loaded up more than it needs to:
jQuery -> RESTful API built on PHP
vs
jQuery -> PHP cURL call -> cURL -> RESTful API built on PHP
Each call takes an extra use of PHP, as you said, spawning another process. The extra use of cURL is no biggie (it's lightweight), but the extra use of PHP might be an issue if you are going to have heavy usage (say 100 concurrent requests, but that really depends on your server and many other factors).
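To make that concrete, the second chain means every request also runs a PHP wrapper roughly like this sketch (the endpoint name is made up for illustration), on top of the PHP process that serves the API itself:

<?php
// get_bookings.php - the wrapper the front-end developer is writing: PHP calling its own API via cURL
$ch = curl_init('http://localhost/api/bookings');   // illustrative endpoint
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$json = curl_exec($ch);
curl_close($ch);

header('Content-Type: application/json');
echo $json;   // the same JSON the browser could have fetched directly

The direct alternative is a single $.getJSON('/api/bookings', ...) call from the browser, which removes that second PHP process entirely.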
Related
I am relatively new to node.js and socket.io. Currently I have a half-finished private web project, which runs only on PHP with a MySQL database on the server side. I decided to bring it to a more advanced level using socket.io, for several features within the project.
So I read a lot about it and watched a whole bunch of tutorials. Also I found this and this during my research.
My question is, if that is still the common way to develop a web application?
More exactly: to use, on one event (like a form submit), both an AJAX request and a socket.emit, for those events where it is necessary/wanted.
The background of this thought is the following. I have a whole bunch of calculations running in PHP now, and the node.js server logically runs JavaScript. So I can easily implement a node.js server without changing anything in my AJAX requests, or rewrite everything I have so far in JS and use only a node.js server.
But this leads to 3 more questions:
Which possibly runs faster on the server side: a calculation scripted in PHP or in JavaScript?
How to use transactions on a node.js server while using MySQL?
And how great is the overhead of converting a PHP array to a JSON object, which you could avoid by using just the node.js server, where you don't need to convert anything?
JavaScript is executed on the client side, so you are limited by the user's hardware, whereas PHP is executed on your server. See this post for more info about performance comparison.
I highly suggest you take a look at this pure node.js client that will perfectly do the job in your case.
PHP has many functions for working with JSON data (json_decode(), json_encode(), ...), but Node.js doesn't require JSON data to be converted. In the end, it really depends on your usage and how you plan to store and use that data.
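For the conversion question specifically, the PHP side is just one function call either way; a toy example (the data is obviously made up, not your actual calculation results):

<?php
$result = array('sum' => 42, 'items' => array(1, 2, 3));    // pretend this came from your calculation
echo json_encode($result);                                   // {"sum":42,"items":[1,2,3]}
$back = json_decode('{"sum":42}', true);                     // true gives an associative array instead of stdClass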
I'm attempting to build a notification system for a PHP application. Every time a booking is placed, we need a notification to appear within a specified user account type inside the application.
I'm using CodeIgniter 2 on a virtual dedicated host, so I'd have the option of requesting the installation of whatever is required to get the job done.
So far, I know that PHP has limited power over how it can trigger jQuery, in that jQuery is limited to the web browser. I know that Node.js and Socket.io can do what I want, but how would that tie in with PHP, if at all?
I also know that a polling mechanism would be bad. I've considered a method that would send the row ID via PHP to a jQuery script within the confirmation page, which could — in theory — accomplish what I have in mind, but this would rely on the web browser of the customer, which is a bit weak.
I've spent a couple of days fumbling around this question, since I'm only just getting to grips with jQuery, while I know hardly anything about Node.js or Socket.io, or what they can and cannot do, or — as mentioned earlier — how they connect with PHP.
Any advice would be welcome.
With real-time push methods, the server pushes data to the clients (channel subscribers) whenever an event occurs on the server. This is more advanced than pull methods like polling, and it gives live communication (i.e., the client gets live updates from the server with no delay; in a pull method there is a time interval between each query).
Examples of real-time push methods: Faye, Pusher, socket.io, Slanger.
Most of the real-time push methods are built on Ruby or Node.js. So if you wish to set up your own real-time server, you must set it up on your server (probably Ruby or Node.js), and you can communicate with that server from PHP using cURL calls.
There are also PHP libraries available for these operations.
If you would like to set up Slanger, you can use the Pusher PHP library itself (you may need to modify it slightly to use it with Slanger). And if you would like to use Faye, here is a PHP library I wrote myself: faye php wrapper
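As a rough illustration of that cURL approach (the endpoint, channel name and payload below are invented; each push server has its own HTTP API, and the PHP libraries mentioned above hide these details for you):

<?php
// After saving the booking, tell the push server to broadcast a notification
$bookingId = 123;                                         // example value
$payload = json_encode(array(
    'channel' => 'admin-notifications',                   // invented channel name
    'event'   => 'new_booking',
    'data'    => array('booking_id' => $bookingId),
));

$ch = curl_init('http://localhost:9292/publish');         // invented push-server endpoint
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $payload);
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: application/json'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
curl_close($ch);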
You could store notifications in a database, with a corresponding timestamp.
Then use long polling in jQuery to receive the messages, calling PHP for the notifications.
A good example was given in this answer:
How do I implement basic "Long Polling"?
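A minimal sketch of the PHP side of such a long-polling endpoint (get_notifications_since() is an assumed helper that queries your notifications table):

<?php
// long_poll.php - jQuery calls this with the timestamp of the last notification it has seen
$since   = isset($_GET['since']) ? (int)$_GET['since'] : 0;
$timeout = 25;                 // how long to hold the request open, in seconds
$start   = time();
$rows    = array();

do {
    $rows = get_notifications_since($since);   // assumed helper: SELECT ... WHERE created_at > $since
    if (!empty($rows)) {
        break;                 // something new arrived, answer immediately
    }
    sleep(1);                  // avoid hammering the database
} while (time() - $start < $timeout);

header('Content-Type: application/json');
echo json_encode(array('now' => time(), 'notifications' => $rows));

The jQuery side simply re-issues the request as soon as the previous one returns, passing the 'now' value back as 'since'.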
I'm currently in the process of building/implementing a logging system for a website I'm working on that's in PHP. The way the logging system works is I send a JSON request to localhost and that JSON gets logged (basically, anyway).
My question is:
What's the fastest way I can make a quick fire-and-forget call with a JSON POST? Is there a way to fire and forget with cURL?
There are multiple ways to do it. You could use the curl_multi functionality of the php_curl extension, which allows you to send asynchronous HTTP requests using cURL, but this requires that extension. GuzzlePHP provides a large wrapper around much of cURL's functionality, including the curl_multi features, if you are looking for an object-oriented approach. PHP's sockets also support asynchronous communication; a library which implements this for the HTTP protocol is available here (the client is written in "pure" PHP with no dependency on cURL, supports asynchronous requests, and fully complies with the HTTP 1.1 spec).
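If best-effort is good enough for logging, another common trick is to skip cURL entirely: open a socket, write the POST, and close without reading the response. A sketch, assuming it is acceptable to lose the occasional log line (the /log.php path is a placeholder):

<?php
// Best-effort JSON POST: returns as soon as the request has been written to the socket
function fire_and_forget_json($host, $path, array $data, $port = 80)
{
    $payload = json_encode($data);
    $fp = @fsockopen($host, $port, $errno, $errstr, 1);   // 1 second connect timeout
    if (!$fp) {
        return false;                                      // fail silently, it's only logging
    }
    $request  = "POST $path HTTP/1.1\r\n";
    $request .= "Host: $host\r\n";
    $request .= "Content-Type: application/json\r\n";
    $request .= "Content-Length: " . strlen($payload) . "\r\n";
    $request .= "Connection: Close\r\n\r\n";
    $request .= $payload;
    fwrite($fp, $request);
    fclose($fp);                                           // do not wait for the response
    return true;
}

fire_and_forget_json('localhost', '/log.php', array('event' => 'pageview', 'user' => 42));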
If you are looking for a fire-and-forget logging solution, you might want to look at something that uses the UDP protocol, like Graylog.
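Since UDP is connectionless, sending the log line is naturally fire-and-forget. A bare-bones sketch in plain PHP (the host and port are placeholders, and a real Graylog setup would expect GELF-formatted messages):

<?php
$message = json_encode(array('short_message' => 'booking created', 'host' => 'web1'));
$sock = @fsockopen('udp://127.0.0.1', 12201, $errno, $errstr, 1);   // placeholder address/port
if ($sock) {
    fwrite($sock, $message);   // never blocks waiting for a reply
    fclose($sock);
}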
You could use a small image that hits a PHP script. The PHP script logs the hit and returns a tiny 1x1 transparent GIF. Then the logging will happen after the page loads.
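A rough sketch of that tracking-pixel script (the log file and query parameter are illustrative, and pixel.gif is assumed to be a real 1x1 transparent GIF stored next to the script):

<?php
// pixel.php - log the hit, then return the tiny image
$event = isset($_GET['event']) ? $_GET['event'] : 'pageview';
$line  = date('c') . ' ' . $_SERVER['REMOTE_ADDR'] . ' ' . $event . "\n";
file_put_contents('/tmp/hits.log', $line, FILE_APPEND);   // illustrative log destination

header('Content-Type: image/gif');
readfile('pixel.gif');   // the stored 1x1 transparent GIF

The page then includes something like <img src="/pixel.php?event=booking"> so the request only fires after the markup has loaded.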
I've developed an application that I would like to use Meteor.js with for real-time updates (I want to enhance but not change my program, for example when a user adds a comment, make it update in real time). The problem is Meteor.js uses Node.js (so JavaScript as the server-side code). I use a LAMP stack. Is it possible to get PHP to feed data into Meteor.js from MySQL?
Meteor is more than just an 'interactive web application' builder or JavaScript framework. The idea is to have only one programming language (besides HTML/CSS for markup) to do all the work. Basically it creates a 'remote server' (in the client's browser) that it can push data to, and at the same time it publishes various APIs to the user's system. The data passed through these APIs/connections has a specific structure which has to be adhered to at all times.
Meteor is built around Node.js, which makes it hard (if not impossible) to run it without this backend. Sure, you can try to mimic the backend using PHP, but it would be a waste of time. Reading your question, you'll be better off using a JavaScript framework like jQuery or Prototype. Unlike Meteor, you will need to do the AJAX calls (POST & callback) yourself, but you can actually decide which backend you want to use (including PHP / MySQL).
If you want to do this anyway, you need to check the Meteor & Node.js source code to see what the minimum requirements are to make Meteor run under PHP. The PHP stack has to interpret the commands Meteor sends and receives, but this won't be an easy task.
You can use Comet (or reverse AJAX) for real-time updates.
Trying to marry Node.js with PHP doesn't sound like a worthwhile path to go down. If someone insisted on using a system like Meteor.js, yet with a PHP back end, it would make more sense to look at AngularJS, which is mainly client side.
Of course, that is a different technology stack. If someone really insisted on the blending, one could consider using server-side sockets to interact with PHP web services, and/or use mongodb and/or mysql-node to interact with the same databases.
I released a Meteorite package that interacts with a WordPress site that has the WordPress JSON API. A quick fix. For now.
It comes with a backend call that will return the raw data, or a publication that stores the posts using their IDs instead of a randomly generated Mongo ID. And some basic templates to get you started, including a Session variable that keeps track of the currently selected post.
I'm working on it a lot more and will eventually have a version that directly makes MySQL calls from Node so you won't need PHP or WordPress, just the ability to access the MySQL database (which can be remote, with the appropriate configuration, or on the same machine).
I'm trying to integrate an old PHP ad management system into a (Django) Python-based web application. The PHP and the Python code are both installed on the same hosts, PHP is executed by mod_php5 and Python through mod_wsgi, usually.
Now I wonder what's the best way to call this PHP ad management code from within my Python code in the most efficient manner (the ad management code has to be called multiple times for each page)?
The solutions I came up with so far, are the following:
Write SOAP interface in PHP for the ad management code and write a SOAP client in Python which then calls the appropriate functions.
The problem I see is that this will slow down the execution of the Python code considerably, since for each page served, multiple SOAP client requests are necessary in the background.
Call the PHP code through os.execvp() or subprocess.Popen() using PHP command line interface.
The problem here is that the PHP code makes use of the Apache environment ($_SERVER vars and other superglobals). I'm not sure if this can be simulated correctly.
Rewrite the ad management code in Python.
This will probably be the last resort. This ad management code just runs and runs, and no one who wrote any part of it is still around :) I'd be quite afraid to do this ;)
Any other ideas or hints how this can be done?
Thanks.
How about using AJAX from the browser to load the ads?
For instance (using JQuery):
$(document).ready(function() {
    $("#apageelement").load("/phpapp/getads.php");
});
This allows you to keep your app almost completely separate from the PHP app.
The best solution is to use server-side includes (SSI). Most web servers support this.
For example this is how it would be done in nginx:
<!--# include virtual="http://localhost:8080/phpapp/getads.php" -->
Your webserver would then dynamically request from your php backend, and insert it into the response that goes to the client. No javascript necessary, and entirely transparent.
You could also use a borderless <iframe>
I've done this in the past by serving the PHP portions directly via Apache. You could either put them in with your media files, (/site_media/php/) or if you prefer to use something more lightweight for your media server (like lighttpd), you can set up another portion of the site that goes through apache with PHP enabled.
From there, you can either take the ajax route in your templates, or you can load the PHP from your views using urllib(2) or httplib(2). Better yet, wrap the urllib2 call in a templatetag, and call that in your templates.