I need to get the status of services across a large number of servers in order to calculate uptime percentages. I may need to use multiple servers to do the checking. Does anyone know of a reliable way to queue them to be checked at a specific time/interval?
I'm writing the application in PHP, but I'm open to using other languages and tools for this. My only requirement is that it must run on Linux.
I've looked into things like Gearman for job queuing, but I haven't found anything that would work well.
In order to get uptime percentages for your services, you can execute commands that check the status of each service and log the results for later analysis and calculation. Here are some ways of doing that:
System commands like top, free -m, vmstat, iostat, iotop, sar and netstat. Nothing comes near these Linux utilities when you are analysing or debugging a problem; they give you a clear picture of what is going on inside your server.
SeaLion: its agent executes all the commands mentioned above, plus custom commands, and their output can be browsed in a web interface. This tool comes in handy when you are working across hundreds of servers, since installation is simple. And it's free.
Nagios: the mother of all monitoring/alerting tools. It is highly customizable but difficult for beginners to set up, although plenty of Nagios plugins exist to help.
Munin
Server Density: a cloud-based paid service that collects important Linux metrics and lets users write their own plugins.
New Relic: another well-known hosted monitoring service.
Zabbix
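As a minimal sketch of the command-and-log approach described above: poll the service, record a timestamped up/down sample each time, then compute the uptime percentage from the log. The `true` check and the log path are stand-ins; a real check would be something like `systemctl is-active --quiet nginx` or a `curl` health probe.

```shell
#!/bin/sh
# Poll a service, log one timestamped up/down sample per run, then compute
# the uptime percentage from the log. "true" stands in for the real check,
# e.g. "systemctl is-active --quiet nginx" or "curl -fsS http://host/health".
LOG=/tmp/uptime.log
: > "$LOG"                      # start fresh for this demo
for i in 1 2 3 4; do
    if true; then status=up; else status=down; fi
    echo "$(date +%s) $status" >> "$LOG"
done
# Uptime % = up samples / total samples
awk '{ n++; if ($2 == "up") up++ } END { printf "%.1f%%\n", 100 * up / n }' "$LOG"
```

In practice you would run the check from cron or a loop on each monitoring host and aggregate the logs centrally.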
There is an application built on Laravel, and it should be ready for a load of 1000 requests per second.
I have done the below tasks:
1- Composer autoload has been dumped
2- Query results are cached
3- All views have been minified
What else should I consider? (The app runs in a Docker container.)
How are you measuring whether you reach the target TPS? I would first get a baseline in order to know how far off you are, and based on that start looking into which part of your application stack to tune (this includes the web server, the database server and any other services used). Tools available for this are JMeter and Apache Bench.
In order to reach 1000 TPS you'll need to tweak the webserver to allow for this type of load. How to approach this depends on the webserver used, so it is difficult to give specifics.
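As an illustration of the kind of webserver tuning meant here, these are the usual starting knobs for nginx (the values are illustrative starting points to benchmark against, not recommendations):

```
worker_processes auto;           # one worker per CPU core
events {
    worker_connections 4096;     # concurrent connections per worker
}
http {
    keepalive_timeout 15;        # reuse connections between requests under load
}
```

The equivalent knobs on Apache would be the MPM settings (e.g. MaxRequestWorkers); either way, change one value at a time and re-run the benchmark.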
With regard to your DB server, there are tools to analyse it as well, such as pgBadger (Postgres), or log files dedicated to slow queries.
Ultimately, you would also want to be on one of the latest PHP versions, as there are significant performance gains in every new release. At the time of writing, the latest released PHP version is 7.4.
In my opinion, these tweaks will yield a greater performance gain than tweaking the PHP code (assuming there is no misuse of PHP), but this of course depends on the specifics of your application.
Optionally, you should also be able to scale horizontally (as opposed to vertically), increasing total TPS by the per-server TPS with each application server you add.
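To get the baseline mentioned above, a quick sketch: run Apache Bench and pull the throughput number out of its report. The `sample` line below is canned ab output so the sketch runs without a live server; in practice you would pipe in the real report.

```shell
#!/bin/sh
# Real run (needs the app answering on localhost):
#   ab -n 10000 -c 100 -k http://localhost/ > report.txt
# Extract "Requests per second" from the report; a canned sample line is used
# here so the sketch is runnable without a live server.
sample='Requests per second:    1234.56 [#/sec] (mean)'
rps=$(printf '%s\n' "$sample" | awk '/Requests per second/ { print int($4) }')
echo "measured ${rps} req/s against a 1000 req/s target"
```

Comparing that number before and after each tweak tells you which changes actually move you toward the target.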
Tips to improve Laravel performance:
Config caching.
Route caching.
Remove unused services.
Classmap optimization.
Optimize the Composer autoload.
Limit use of plugins.
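The tips above map to a handful of commands at deploy time. A sketch, shown with a dry-run stub so it can be read and tested outside a Laravel checkout; inside the real app directory you would swap `run()` for actual execution:

```shell
#!/bin/sh
# Dry-run stub: prints each command instead of executing it. In a real
# Laravel app directory, replace the body with: run() { "$@"; }
run() { echo "+ $*"; }

run composer dump-autoload --optimize --no-dev   # classmap/autoload optimization
run php artisan config:cache                     # config caching
run php artisan route:cache                      # route caching
run php artisan view:cache                       # precompile Blade views
```

Running these in the container's build step, rather than at boot, keeps startup fast and the image reproducible.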
I want to build a server that will listen on a custom port and talk with my web application through a custom protocol. The server will have a dispatcher and workers that will each undertake a task and complete it.
Is Laravel up to the job, or should I go with something more specific?
EDIT:
I would like to clarify that it's not an API. Basically, a PHP script will run in a loop in CLI mode (meaning no Apache or nginx is involved). The script will open a socket and listen on a certain port for connections from clients. Once a client connects, the server will start some jobs and send back the answer. It also involves a job queue (probably a database) to which the server will connect, fetch jobs, and fork new processes that will complete them.
EDIT:
It seems that you don't need much of a framework at all (except maybe for the database operations, since if you use sockets you will probably not use much of the framework's functionality, like routing or view templating). Depending on the complexity of your database, I'd decide whether to use a framework; if it's very complex, features like Eloquent might help. Think about how much of the framework you will actually use, and whether you could pull in only the pieces you need through Composer instead.
END EDIT
Should you use Laravel/PHP to build such a server? It will probably be too slow for that purpose.
1) If you want to make your own server (not a website or API), I'd much rather go for Node.js or something along those lines (Ruby, Python, C#...).
2) By "custom protocol" I assume you don't mean something different from HTTP/TCP/IP? Then what do you mean by a "custom protocol"?
Is it possible, on a website/webserver (with full root access), to run a PHP script which executes MySQL queries in the background? What I mean by that:
A user clicks to process something; however, to prevent the user from waiting on the query, it should look finished to the user, so he doesn't have to wait for PHP/MySQL in the browser.
Meanwhile, the script should keep running on the server and finish.
How can I do that? If there is no effective solution in PHP, is it possible with other languages?
I'm not talking about cron jobs. I'm on an Ubuntu machine (no Windows).
For running many PHP scripts (all the same) in the background, would nginx or Apache be the better solution? Is it even relevant?
The best architecture I could recommend here is probably a queue/worker setup. For instance, this is simple to do with Gearman (alternatively: ØMQ, RabbitMQ or similar advanced queues). You spin up a number of workers which run in the background and can handle the database query (I'm partial to daemonizing them with supervisord). Spin up as many as you want to run in parallel; since the job is apparently somewhat taxing, you want to carefully control the number of running workers.

Then, any time you need to run this job, you fire off an asynchronous job for the Gearman workers and return immediately to your user. The workers will handle the request whenever they get around to it. This assumes you don't need any particular feedback for the user; the job can simply finish whenever, without anybody needing to know about it immediately.
If you do need to provide user feedback when the job is finished, you may simply want to try to execute the request via AJAX. For really sophisticated setups and realtime feedback, you may use the Gearman approach with feedback delivered via a pub/sub websocket. That's quite an involved setup though.
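At its very simplest, without Gearman, the "return immediately" part of the idea is just detaching the heavy work into a background process. A sketch (the result file, the `sleep`, and the `wait` are illustrative; a real web request would respond to the user instead of waiting):

```shell
#!/bin/sh
# Detach the heavy job (stand-in: sleep, then write a result file) and answer
# the caller immediately. /tmp/job.result and the sleep are illustrative.
rm -f /tmp/job.result
nohup sh -c 'sleep 1; echo done > /tmp/job.result' >/dev/null 2>&1 &
echo "request accepted"     # the user sees this right away
wait                        # demo only: a real web request would return here
cat /tmp/job.result
```

A queue adds what this lacks: bounded concurrency, retries, and survival across restarts, which is why the Gearman/supervisord setup is the recommendation for anything taxing.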
I want to make an application where users use their own computers together with a host. The host clicks "Start", and then a button should automatically appear on all users' computers at the same time. The first person to click the button wins. I want this to happen in the browser, but I don't know which technology to use. I already know PHP and MySQL, but I don't know any way to update users' computers in real time. Which technology would be the best choice to make this happen?
The solution here is basically web sockets, likely with a pub/sub layer on top. It can be done relatively simply with a decent Javascript and server-side library. PHP isn't the ideal language for this, but it works just fine with the right tools. Ratchet is a decent PHP web socket server implementation, and Autobahn|JS a decent client-side library (note: at the time of writing the latest Autobahn|JS WAMP implementation is incompatible with the older WAMP implementation of Ratchet, use Autobahn|JS WAMP v1). Follow the Ratchet tutorial, then expand into setting up a pub/sub server as described here (you don't need the ZeroMQ components, you'll be triggering events by a publish action instead of an external ZeroMQ event).
That's a 30,000-foot overview; go forth and try it.
Pusher.com has the ideal solution for this. You can send events, listen to these events and then respond accordingly. They have a free plan which I think is way more than you will probably need. Pusher works with JavaScript, and it's extremely simple to get started.
I suggest reading up on the documentation at pusher.com/docs
In the middle of 2010, I found a class library called PHPToCLib. It ran flawlessly for over a year: I was able to implement a tremendous amount of my own custom code into an AIM bot that I could run from my command prompt. However, near the end of 2011, the servers stopped responding to the script. It connects to toc.oscar.aol.com on port 5190, and that hasn't changed. I am aware that AOL discontinued their TOC2 servers and that it's not possible to connect to them anymore. However, I downloaded a program called TerraIM that uses the same specifications and is somehow able to connect to them. I was wondering whether there are any updates on how I could get my script to connect and, if so, what I need to change.
Thank you in advance.
TerraIM also supports the OSCAR protocol, which I assume it is defaulting to. If you are working with IM bots, the absolute best way to go is to leverage libpurple. Unfortunately, there is no good PHP binding to libpurple, but there are a couple of Python bindings. If you don't wish to migrate your code, there is an implementation that provides an HTTP interface, which may be easy to integrate with depending on your use case. Alternatively, you could use Thrift to communicate between your existing PHP code and the Python bindings; this would require a bit more coding than leveraging an HTTP interface. Here are some resources you may find helpful:
Python bindings:
github.com/fahhem/python-purple
github.com/Raptr/Heliotrope
HTTP interface:
github.com/atamurad/http-purple
Thrift:
http://thrift.apache.org/