Should I build my server with Laravel? [closed] - php

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 6 years ago.
I want to build a server that will listen to a custom port and talk with my web application through a custom protocol. The server will have a dispatcher and workers that will undertake a task and complete it.
Is Laravel up for the job, or should I go with something more specific?
EDIT:
I would like to clarify that it's not an API. Basically, a PHP script will run in a loop in CLI mode (meaning no Apache or nginx involved here). The script will open a socket and listen on a certain port for connections from clients. Once a client connects, the server will start some jobs and send back the answer. It also involves a job queue to which the server will connect (probably a database), get the jobs, and fork new processes that will complete them.
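In concrete terms, the shape I have in mind is roughly this minimal CLI sketch in plain PHP; the port, the one-line protocol, and the handle_job() function are just illustrative placeholders, and it assumes the pcntl extension is available:
<?php
// Dispatcher loop: listen on a custom port, fork a worker per connection.
$server = stream_socket_server('tcp://0.0.0.0:9000', $errno, $errstr);
if ($server === false) {
    die("Could not bind: $errstr ($errno)\n");
}
while (true) {
    $client = stream_socket_accept($server, -1);     // block until a client connects
    if ($client === false) {
        continue;
    }
    $pid = pcntl_fork();                             // fork a worker for this client
    if ($pid === 0) {
        $request = trim((string) fgets($client));    // one line of the custom protocol
        fwrite($client, handle_job($request) . "\n");
        fclose($client);
        exit(0);
    }
    fclose($client);                                 // parent keeps accepting
    while (pcntl_waitpid(-1, $status, WNOHANG) > 0); // reap finished workers
}
// Illustrative placeholder: the real version would pull jobs from the queue/database.
function handle_job(string $job): string
{
    return "done: $job";
}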

EDIT:
It seems that you don't need much of a framework at all (except maybe for the database operations part, since if you use sockets you will probably not use much of the framework's functionality, like routing or view templating). Depending on the complexity of your database, I'd decide whether to use a framework or not. If it's very complex, features like Eloquent might help. I think you should consider how much of the framework you will actually use, and whether you can pull in only the pieces you need through Composer instead.
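If the database layer is all you really want, Eloquent can be pulled in on its own with composer require illuminate/database and used through its Capsule manager; a rough sketch, where the connection details are placeholders:
<?php
require 'vendor/autoload.php';

use Illuminate\Database\Capsule\Manager as Capsule;

$capsule = new Capsule;
$capsule->addConnection([
    'driver'    => 'mysql',
    'host'      => '127.0.0.1',
    'database'  => 'jobs',       // illustrative
    'username'  => 'worker',
    'password'  => 'secret',
    'charset'   => 'utf8mb4',
    'collation' => 'utf8mb4_unicode_ci',
    'prefix'    => '',
]);
$capsule->setAsGlobal();
$capsule->bootEloquent();

// Query a (hypothetical) jobs table from the CLI script, no full framework needed.
$pending = Capsule::table('jobs')->where('status', 'pending')->get();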
END EDIT
Should you use Laravel/PHP to build a server? It will probably be too slow for that purpose.
1) If you want to make your own server (not a website or API), I'd much rather go for Node.js or something along those lines (Ruby, Python, C#...).
2) By "custom protocol", I assume you don't mean something other than HTTP/TCP/IP? Then what do you mean by a "custom protocol"?

Related

Use of containers for server side [closed]

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 2 years ago.
I am a newbie and trying to create a CGI server. The server will get requests from different clients (the server will have to open Chrome and do something, so a different Chrome profile for every client). I will receive the requests using PHP. What is the best practice to do so? Should Docker be used in this case?
As was stated in the comment, there are lots of possibilities and the best practice could vary quite a bit by company, architecture, and tech stack.
But all things being equal, I can say that this is typically a good use case for Docker, and it'd be a good place to start (a minimal Dockerfile sketch follows the options below).
Other options:
Going with a non-containerized approach: run the CGI server directly on a host (physical or virtual). One tradeoff here is that the host's environment must be configured to support the server, rather than just the container's environment.
Doubling down on a containerized approach: if you plan on running multiple workloads across different containers (i.e. beyond a single CGI server), you may look at utilizing a container orchestrator, the de facto standard being Kubernetes.
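If you do start with Docker, a minimal image for a PHP-based server could be sketched like this; the base image, port, and entrypoint are illustrative and depend on your stack (especially if the container also needs to drive Chrome):
# Dockerfile sketch: package the PHP server as a container image.
FROM php:8.2-cli
WORKDIR /app
COPY . /app
# Port the server listens on; adjust to your setup.
EXPOSE 9000
# Long-lived entrypoint; swap in php-fpm or your CGI handler as needed.
CMD ["php", "server.php"]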

Update users in realtime (Web) [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
I want to make an application where users use their own computers together with a host. The host clicks "Start", and then a button should automatically appear on all users' computers at the same time. The first person who clicks the button wins. I want this to happen in the browser, but I don't know which technology to use. I already know PHP and MySQL, but I don't know any way to update users' computers in real time. Which technology would be the best choice to make this happen?
The solution here is basically WebSockets, likely with a pub/sub layer on top. It can be done relatively simply with a decent JavaScript library and a decent server-side library. PHP isn't the ideal language for this, but it works just fine with the right tools. Ratchet is a decent PHP WebSocket server implementation, and Autobahn|JS a decent client-side library (note: at the time of writing, the latest Autobahn|JS WAMP implementation is incompatible with the older WAMP implementation in Ratchet; use Autobahn|JS WAMP v1). Follow the Ratchet tutorial, then expand into setting up a pub/sub server as described here (you don't need the ZeroMQ components; you'll be triggering events with a publish action instead of an external ZeroMQ event).
That's a 30,000-foot overview; go forth and try it.
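For a feel of the server side, a bare-bones Ratchet server that just broadcasts every incoming message (e.g. the host's "Start") to all connected clients might look roughly like this; it assumes cboden/ratchet is installed via Composer, and the class name and port are illustrative:
<?php
require 'vendor/autoload.php';

use Ratchet\MessageComponentInterface;
use Ratchet\ConnectionInterface;
use Ratchet\Server\IoServer;
use Ratchet\Http\HttpServer;
use Ratchet\WebSocket\WsServer;

class GameServer implements MessageComponentInterface {
    protected $clients;
    public function __construct() { $this->clients = new \SplObjectStorage; }
    public function onOpen(ConnectionInterface $conn) { $this->clients->attach($conn); }
    public function onMessage(ConnectionInterface $from, $msg) {
        // Relay the message (e.g. "start") to every connected browser.
        foreach ($this->clients as $client) {
            $client->send($msg);
        }
    }
    public function onClose(ConnectionInterface $conn) { $this->clients->detach($conn); }
    public function onError(ConnectionInterface $conn, \Exception $e) { $conn->close(); }
}

IoServer::factory(new HttpServer(new WsServer(new GameServer())), 8080)->run();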
Pusher.com has the ideal solution for this. You can send events, listen for those events, and then respond accordingly. They have a free plan which I think is way more than you will probably need. Pusher works with JavaScript, and it's extremely simple to get started.
I suggest reading up on the documentation at pusher.com/docs
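On the PHP side, triggering an event with the pusher/pusher-php-server package is only a few lines; the credentials, channel, and event names below are placeholders:
<?php
require 'vendor/autoload.php';

// Placeholders: use your own app credentials and cluster from the Pusher dashboard.
$pusher = new Pusher\Pusher('app-key', 'app-secret', 'app-id', ['cluster' => 'eu']);

// When the host clicks "Start", broadcast it; every subscribed browser reacts at once.
$pusher->trigger('game-channel', 'start-round', ['startedAt' => time()]);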

Best way to communicate from Android app to Linux daemon [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
I'm writing an Android app that is meant to hand commands (or rather, data) to a C daemon running on another machine on the same network (it should also work from an external network at some point in the future), but I'm having trouble choosing the best way (or protocol) to do that.
Communicating through some sort of API (PHP, Python, etc.) isn't really an option (maybe I'm wrong about this) because the data is time-critical and should take the fastest route possible, so I'm trying to avoid the overhead of HTTP and another layer between the daemon and the app. On the other hand, the daemon should be accessible from a locally running PHP script too (there should be an API in the future, so maybe the extra "layer" isn't that critical?).
But even if I choose the API solution, what's the best way then? Sockets, general IPC?
Any suggestions or experience with a similar situation would be helpful.
In your question you say that it's time-critical but also that it's on the same network. As long as your application doesn't have any performance problems, you won't run into timing issues. It also depends on your daemon, though.
I've worked with a lot of daemons, even remote ones, and TCP sockets have always been a good option; I've never hit any limitations using them. Just be sure to choose between implementing a Service if your socket needs to stay alive for your app's whole life cycle, or an AsyncTask or Thread if it's for a limited task.
This is what I use, for instance:
// Connect to the daemon with a timeout, then wrap the input stream for reading.
Socket socket = new Socket();
socket.connect(new InetSocketAddress(host, port), timeout);
BufferedReader in = new BufferedReader(new InputStreamReader(socket.getInputStream(), "ISO-8859-1"));
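For the locally running PHP script mentioned in the question, talking to the same daemon over TCP is just as simple; a rough sketch, where the host, port, and command format are assumptions about your daemon's protocol:
<?php
// Send one command to the C daemon and read a single-line reply.
$daemon = stream_socket_client('tcp://127.0.0.1:5555', $errno, $errstr, 2.0);
if ($daemon === false) {
    die("Could not connect: $errstr ($errno)\n");
}
fwrite($daemon, "SET_STATE 1\n");   // illustrative command
$reply = fgets($daemon);
fclose($daemon);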

Large-scale service monitoring at regular intervals [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 8 years ago.
I need to get the status of services across a large number of servers in order to calculate uptime percentages. I may need to use multiple servers to do the checking. Does anyone know of a reliable way to queue them to be checked at a specific time/interval?
I'm writing the application in PHP, but I'm open to using other languages and tools for this. My only requirement is that it must run on Linux.
I've looked into things like Gearman for job queuing, but I haven't found anything that would work well.
In order to get uptime percentages for your services, you can execute commands that check the status of each service and log the results for further analysis/calculation (a small PHP sketch of such a check loop follows the list). Here are some ways of doing that:
System commands like top, free -m, vmstat, iostat, iotop, sar, netstat, etc. Nothing comes near these Linux utilities when you are analysing/debugging a problem. These commands give you a clear picture of what is going on inside your server.
SeaLion: its agent executes all the commands mentioned in #1 as well as custom commands. The output of these commands can be accessed in a clean web interface. This tool comes in handy when you are working across hundreds of servers, as installation is quite simple. And it's free.
Nagios: the mother of all monitoring/alerting tools. It is highly customizable but very difficult for beginners to set up, although there are some Nagios plugins that can help.
Munin
Server Density: a cloud-based paid service that collects important Linux metrics and gives users the ability to write their own plugins.
New Relic: another well-known hosted monitoring service.
Zabbix
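If you'd rather roll your own checker in PHP, the core is just a loop that probes each service port and logs the result for later aggregation into uptime percentages; a minimal sketch (host list, ports, and log format are illustrative), meant to be run from cron on one or more checking servers:
<?php
$services = [
    ['host' => 'web-01.example.com', 'port' => 80],
    ['host' => 'db-01.example.com',  'port' => 3306],
];
foreach ($services as $svc) {
    $start = microtime(true);
    $conn  = @fsockopen($svc['host'], $svc['port'], $errno, $errstr, 5);
    $up    = $conn !== false;
    if ($conn) {
        fclose($conn);
    }
    // One CSV row per check: timestamp, host, port, up/down, response time.
    file_put_contents('uptime.log', sprintf(
        "%s,%s,%d,%s,%.3f\n",
        date('c'), $svc['host'], $svc['port'], $up ? 'up' : 'down', microtime(true) - $start
    ), FILE_APPEND);
}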

PHP connect to AIM TOC [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 5 years ago.
In the middle of 2010, I found a class library called PHPToCLib. It ran flawlessly for over a year: I was able to implement a tremendous amount of my own custom code in an AIM bot that I could run from my command prompt. However, near the end of 2011, the servers stopped responding to the script. It connects to toc.oscar.aol.com on port 5190, and that hasn't changed. I am indeed aware that AOL discontinued their TOC2 servers and that it's not possible to connect to them anymore. However, I downloaded a program called TerraIM that uses the same specifications and is somehow able to connect to them. I was wondering if there are any updates on how I could get my script to connect, and if so, what do I need to change?
Thank you in advance.
TerraIM also supports the OSCAR protocol, which I assume it's defaulting to. If you are working with IM bots, the absolute best way to go is to leverage libpurple. Unfortunately there is not a good PHP binding to libpurple, but there are a couple of Python bindings. If you don't wish to migrate your code, there is an implementation that provides an HTTP interface, which may be easy to integrate with depending on your use case (see the sketch after the links below). Alternatively, you could use Thrift to communicate between your existing PHP code and the Python bindings; this would require a bit more coding than leveraging an HTTP interface. Here are some resources you may find helpful:
Python bindings:
github.com/fahhem/python-purple
github.com/Raptr/Heliotrope
HTTP interface (http-purple):
github.com/atamurad/http-purple
Thrift:
http://thrift.apache.org/
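If you go the HTTP-interface route, the PHP side is just an ordinary HTTP client. The endpoint and payload below are purely hypothetical; they would need to match whatever the bridge actually exposes:
<?php
// Hypothetical example of sending a message through an HTTP bridge to libpurple.
$ch = curl_init('http://localhost:8000/send');   // made-up endpoint
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => json_encode(['to' => 'buddy@example.com', 'message' => 'hello']),
    CURLOPT_HTTPHEADER     => ['Content-Type: application/json'],
    CURLOPT_RETURNTRANSFER => true,
]);
$response = curl_exec($ch);
curl_close($ch);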
