Lightweight, fast PHP logging

Suppose we need to log some data on every PHP call for each request made to the server of a high-traffic, request-heavy web application, essentially to trace the actions taken by each client.
I was considering buffering the entries in memory and then writing them all in one go, to avoid frequent disk access.
Is there a PHP framework that already does this, which I can reuse?
I need to do this on the actual production server, so I don't want to use tools like Xdebug.

Redis would be my suggestion.
There is a PHP library, Rediska, that makes use of a Redis server. At some point the data can be dumped to the database, e.g. when server load isn't at its highest.
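For illustration, a minimal sketch of that pattern using the phpredis extension (the same idea applies to Rediska); the key name log:buffer and the log path are assumptions:

```php
<?php
// Per request: append one entry to a Redis list (in-memory, fast).
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$entry = json_encode([
    'time'   => microtime(true),
    'client' => $_SERVER['REMOTE_ADDR'] ?? 'cli',
    'uri'    => $_SERVER['REQUEST_URI'] ?? '',
]);
$redis->rPush('log:buffer', $entry);
```

A companion script, run from cron when load is low, then drains the buffer to disk in one go:

```php
<?php
// Drain the buffer; lPop returns false once the list is empty.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

while (($entry = $redis->lPop('log:buffer')) !== false) {
    file_put_contents('/var/log/app/trace.log', $entry . "\n", FILE_APPEND | LOCK_EX);
}
```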

I would use apc_store; I use it for a lightweight PHP request-time analyzer:
https://gist.github.com/aiphee/8004486cbd37b3f13efd271b8457cb38
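As a rough sketch of the same idea (note that the legacy apc_* functions live on as APCu's apcu_* on modern PHP; the key name and flush threshold are assumptions):

```php
<?php
// Buffer log entries in shared memory via APC(u); hit the disk only
// once the buffer reaches a threshold.
$buffer   = apcu_fetch('log_buffer') ?: [];
$buffer[] = sprintf('%s %s %s',
    date('c'),
    $_SERVER['REMOTE_ADDR'] ?? '-',
    $_SERVER['REQUEST_URI'] ?? '-'
);

if (count($buffer) >= 100) {   // flush threshold, tune to taste
    file_put_contents('/var/log/app/trace.log',
        implode("\n", $buffer) . "\n", FILE_APPEND | LOCK_EX);
    $buffer = [];
}
apcu_store('log_buffer', $buffer);
```

Note that the fetch/modify/store sequence is not atomic, so concurrent requests can drop entries; a production version would want apcu_entry() or a per-process buffer flushed at shutdown.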

Related

PHP - cross-application communication

I have two PHP applications on my server. One of them has a REST API which I would like to consume and render in the second application. Is there a better way than curling the API? Can I somehow ask php-fpm for the data directly, or something like that?
Doing curl and making a request through the webserver seems wrong.
All this happens on a single server. I know it probably doesn't scale well, but it's a small project.
Why use REST if you can access the functions directly?
If everything is on the same server, then there is no real need for REST, since it makes a somewhat pointless round trip through the webserver.
But if the API is already there and you don't care about the overhead (which is reasonable if there's not much traffic), then consider file_get_contents instead of curl: it is easier to use, though I doubt it is meaningfully faster or slower; both are fine.
You could also run a second webserver (a second virtualhost) on a different port for internal use. That way things are nicely separated.
(If everything were on different servers on a local network, then using sockets would be fastest.)
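A minimal sketch of the file_get_contents approach; the internal URL, port, and response shape are assumptions:

```php
<?php
// Call the other application's REST endpoint over the internal vhost.
$context = stream_context_create([
    'http' => ['method' => 'GET', 'timeout' => 2],
]);

$json = file_get_contents('http://127.0.0.1:8081/api/items', false, $context);
if ($json === false) {
    // The internal vhost is down or timed out; handle gracefully.
    exit('API unavailable');
}
$items = json_decode($json, true);
```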
"Doing curl and making a request through the webserver seems wrong" - I disagree with that. You can still achieve what you want using PHP cURL, even if it's on the same server.
I had the same problem, but I solved it by using MySQL to "queue" tasks; the worker could use any polling method, or PHP could spawn a new server-side worker.
Since the results were stored in the same database, the PHP pages could load the results, or the status, at any time.
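A hedged sketch of that queue pattern with PDO; the tasks table schema and column names are assumptions:

```php
<?php
// Assumed schema:
// CREATE TABLE tasks (
//   id INT AUTO_INCREMENT PRIMARY KEY,
//   payload TEXT,
//   status ENUM('pending','done') DEFAULT 'pending',
//   result TEXT NULL
// );
$pdo = new PDO('mysql:host=127.0.0.1;dbname=app', 'user', 'pass');

// Producer (web request): enqueue a task for the other application.
$pdo->prepare('INSERT INTO tasks (payload) VALUES (?)')
    ->execute([json_encode(['op' => 'sync'])]);

// Worker (CLI loop or cron): poll for pending tasks, record results.
$row = $pdo->query("SELECT id, payload FROM tasks WHERE status = 'pending' LIMIT 1")
           ->fetch(PDO::FETCH_ASSOC);
if ($row) {
    $result = 'processed: ' . $row['payload'];   // real work goes here
    $pdo->prepare("UPDATE tasks SET status = 'done', result = ? WHERE id = ?")
        ->execute([$result, $row['id']]);
}
```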

How do I check for a change in a file that has been included in an HTML doc through an AJAX script?

I am writing a JavaScript for an in-browser IM client for the sake of practicing and learning JavaScript and AJAX.
I need to be able to check for a change in the file size of a text file being used as temporary storage for 40-80 SQL entries containing messages, so that the display can be updated.
At the moment I am using a setInterval function to periodically check for a change in file size with a short PHP script, but this causes issues: if the interval is too long, messages are delayed; if it is shorter, it means a lot of PHP scripts running in quick succession, which takes up server resources.
What is the best way to do this if the main concern is to reduce server resource usage?
(I am running my server off a rather low-tech PC I've scraped together: 2 GB RAM, 2.8 GHz AMD Sempron processor.)
Preferably, I would want to do this using an AJAX event triggered by someone sending a message, i.e. when user B triggers the event that edits the file by pressing Enter, that triggers a function on user A's side that updates the HTML file.
Any ideas? I am open to any solution to this particular problem. I gave specific examples of what I want to happen in the specific languages in order to give a better idea of what it is I am attempting to do.
If there is a way to do this that isn't JavaScript/PHP, I'd also be open to exploring that as an option.
Doing this with PHP can be a bit cumbersome. You could try doing something like long polling where you keep the HTTP request open until the server has new data to send to the user. If messages are sent frequently, this might not be ideal. You might want to consider using event-driven web technologies like node.js with something like Socket.IO.
In any case, you'll likely want to maintain a connection with the server if you want to get the message in near real-time. There are ways to use WebSockets with PHP as well, but PHP isn't really the best for this because it's not designed to keep scripts running for long periods (also see What exactly entails setting up a PHP Websocket Server?).
Browsers and HTTP/AJAX generally work on a "pull" model: the browser (or AJAX) sends the server a request, and the server answers with a response.
There isn't generally much provision for the server to contact the browser to "push" an event. This can, however, be simulated by a long-running request, to which the server writes data when the event (or events) occur.
For example, this could be a request that answers "empty" after a timeout of 10-30 seconds, or to which the server answers immediately if there are events in its queue.
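A rough long-poll endpoint in PHP to illustrate; the file path, parameter name, and timings are assumptions:

```php
<?php
// Hold the request open until the message file changes size, or time out.
$file     = '/tmp/messages.txt';           // placeholder path
$lastSize = (int)($_GET['size'] ?? 0);     // size the client last saw
$deadline = time() + 25;                   // stay under typical HTTP timeouts

while (time() < $deadline) {
    clearstatcache(true, $file);
    $size = filesize($file);
    if ($size !== $lastSize) {
        echo json_encode(['size' => $size, 'data' => file_get_contents($file)]);
        exit;
    }
    usleep(500000);                        // re-check twice a second
}
// Timed out with no change; the client simply polls again.
echo json_encode(['size' => $lastSize, 'data' => null]);
```

Bear in mind that each held request ties up a PHP worker for its full duration, which is the PHP limitation noted later in this answer.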
With a Java server this is easy to do, and I've used this successfully for event notification in a major integration project a few years back.
However, I'm not sure how much ability there is in PHP (probably very near zero) to maintain overall server state, coordinate or communicate between threads/requests, or maintain event queues.
You could look into something like a Java webapp running on Tomcat. All you need is a basic web.xml and one Servlet class, and you can build just about anything from there.

Quick writing to a log file after an HTTP request

I recently finished building a web server whose main responsibility is simply to take the contents of the body data in each HTTP POST request and write it to a log file. The POST data is obfuscated when received, so I'm de-obfuscating it and writing it to a log file on the server. The de-obfuscated content is a series of random key-value pairs that differ between every request; it is not fixed data.
The server is running Linux with 2.6+ kernel. Server is configured to handle heavy traffic (open files limit 32k, etc). The application is written in Python using web.py framework. The http server is Gunicorn behind Nginx.
After using ApacheBench to do some load testing, I noticed that it can handle up to about 600-700 requests per second without any log-writing issues; Linux natively does a good job of buffering. Problems start to occur when more requests per second than that attempt to write to the same file at the same moment: data does not get written and information is lost. I know that the "write directly to a file" design might not have been the right solution from the get-go.
So I'm wondering if anyone can propose a solution that I can implement quickly, without altering too much infrastructure and code, to overcome this problem?
I have read about in-memory storage like Redis, but I realized that if the data is sitting in memory during a server failure, then that data is lost. I have read in the docs that Redis can be configured as a persistent store; there just needs to be enough memory on the server for Redis to do it. This solution would mean that I would have to write a script to dump the data from Redis (memory) to the log file at a certain interval.
I am wondering if there is an even quicker solution? Any help would be greatly appreciated!
One possible option I can think of is a separate logging process, so that your web.py app is shielded from the performance issue. This is the classical way of handling a logging module. You can use IPC or any other bus/communication infrastructure. With this you will be able to address the following issues:
Logging will not be a huge bottleneck for high-capacity call flows.
A separate module can provide a switch-on/off facility.
There would not be any significant process memory usage.
However, you should bear in mind the points below:
You need to be sure that logging is restricted to just logging; it must not become a data store for business processing, or you may end up with synchronization problems in your business logic.
The logging process (here I mean an actual Unix process) becomes critical and slightly complex (i.e. you may have to handle a form of IPC).
HTH!
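The idea is language-agnostic; as a rough sketch in PHP (the main language of this thread), the web workers could fire each entry at a Unix datagram socket while a single consumer process serializes the disk writes. The socket path and log path below are assumptions:

```php
<?php
// Producer side (each web worker): fire-and-forget datagram, no disk I/O.
$sock = socket_create(AF_UNIX, SOCK_DGRAM, 0);
$line = json_encode(['t' => microtime(true), 'body' => 'decoded POST data here']);
@socket_sendto($sock, $line, strlen($line), 0, '/run/app-logger.sock');
```

And the logger process, run separately (e.g. under supervisord or systemd):

```php
<?php
// Consumer side: the only writer, so the log file never sees
// concurrent appends.
$srv = socket_create(AF_UNIX, SOCK_DGRAM, 0);
@unlink('/run/app-logger.sock');
socket_bind($srv, '/run/app-logger.sock');

while (socket_recvfrom($srv, $buf, 65536, 0, $from) !== false) {
    file_put_contents('/var/log/app/requests.log', $buf . "\n", FILE_APPEND);
}
```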

AJAX to get data from the server

A page sends an AJAX call to the server and should get item info in response. The array to look up and return is rather big, and I can't hold it in the PHP file that accepts the request. So, as far as my knowledge and experience tell me, there are two methods:
Access database for each request.
Store items in files (e.g. “item12.txt”) and send contents to the user.
My C experience says that opening and closing a file takes much more system time than the rest of the program. How is it in PHP? What is the preferred method (most importantly, resource-wise): the file system or a database? Is there any other way you would recommend (e.g. JavaScript directly loading the file with the variable array from the server for each request)? Maybe there's some innovative method lying around that you're aware of?
P.S. On the server side only a number will be accepted, so no worries regarding someone trying to access files on the server or do some fancy stuff to the database.
Sockets
Depending on how many requests you will be handling, you could look into socket connections.
Sockets give you two-way communication between the client and the server, which would allow you to do interactive things as needed.
Socket tutorial 1
Socket tutorial 2
Node.js
node.js is the new kid on the block. You write your own socket webserver and use JavaScript to communicate with it. This is a great alternative to AJAX, as it's much more efficient and reliable.
node.js can be run alongside PHP, and used only for AJAX-like calls.
node.js
node.js socket tutorial
There is nothing innovative here. If you make low-frequency calls for the data and want super-simple access to it, use files. But today it is much better to use a database; SQLite is fine, I think. If you need more performance, use MySQL or a NoSQL solution. Tools are made to solve things; use the right tool for your purpose.
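For the lookup-by-number case in the question, a minimal sketch with PDO and SQLite; the table, column, and file names are assumptions:

```php
<?php
// An id comes in via AJAX; one row goes out as JSON.
$id  = (int)($_GET['id'] ?? 0);          // "a number only will be accepted"
$pdo = new PDO('sqlite:/var/data/items.db');

$stmt = $pdo->prepare('SELECT info FROM items WHERE id = ?');
$stmt->execute([$id]);

header('Content-Type: application/json');
echo json_encode(['id' => $id, 'info' => $stmt->fetchColumn()]);
```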

How to stream the contents of a file live to a browser

I'm trying to find an efficient way to watch the server log on a webpage. I don't mind building an app; I just can't work out the best way to do it.
Is there a way to keep a stream open to a file with PHP and to the browser, or will it have to be done by polling the file every x seconds?
Thanks in advance,
Shadi
The best solution is definitely AJAX in some capacity. The only way to have the server "push" to you the way you describe (maintaining an open stream) would require the HTTP connection to remain open, which would ultimately trigger timeouts and consume a lot of resources. I would look into the CometD library. The downside is that I believe it depends on Java, although the site does mention Perl, Python and "other languages." In the worst case, you could use a specific Jetty implementation just for log monitoring on a specific port. Regardless, that framework would most likely be your best bet.
Any web-based chat mechanism essentially uses a push architecture and would be good to look at for some inspiration. In this case, instead of users creating messages that are fired to other users, the server creates the events (when a log message is generated). Check out this article on Facebook chat for some insight into how they do it. Google chat might be worth looking into if you can find some stuff on the architecture.
For the actual logging, I'm not sure if you are in need of help with that, but log4php, which is currently under incubation, might be a good place to start, as it provides a configuration that can simultaneously log to an arbitrary number of "loggers" like database, file, socket, etc. You could likely find one that allows you to tie it into whatever push framework you elect to use.
Good luck!
Remember that the web model is essentially stateless (disconnected). With that in mind, when a client submits a request, the server processes the request and then sends a response accordingly. You can keep track of the client's actions using cookies and/or sessions, but the resources reserved for a request are released after the response is sent back.
I think the best way to meet your goal is to develop a web service that checks the status of the log and fetches the diff (if any). Your app may consist of a web page with a div that displays the diff returned by the web service.
A script with a timer will trigger the call to the web service.
I will try to do something like this in a few weeks, and I will post the entire solution on the moropo blog (in Spanish). You can ask for a post translation using the comments.
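A hedged sketch of such a diff-fetching endpoint: the client sends the offset it has already seen and the server returns only the new bytes. The log path and parameter name are assumptions:

```php
<?php
// tail.php - return everything appended to the log since $offset.
$file   = '/var/log/app/server.log';       // placeholder path
$offset = (int)($_GET['offset'] ?? 0);

clearstatcache(true, $file);
$size  = filesize($file);
$chunk = '';

if ($size > $offset) {
    $fh = fopen($file, 'r');
    fseek($fh, $offset);
    $chunk = fread($fh, $size - $offset);
    fclose($fh);
}

header('Content-Type: application/json');
echo json_encode(['offset' => $size, 'diff' => $chunk]);
```

The timed script on the page then calls this endpoint with the last offset it received and appends the diff to the div.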
The best way to do it is to use AJAX to pull the file content every x seconds, giving the illusion of real time.
If you do want real time, you can use an XMPP server, but from what I can see, the first solution is quite sufficient and doesn't require a lot of work.
Try wonlog.
https://www.npmjs.com/package/wonlog
You can stream multiple log files to a web browser.
