Best way to communicate from Android app to Linux daemon [closed] - php

I'm writing an Android app that is meant to hand commands (or rather, data) to a C daemon running on another machine on the same network (sometime in the future it should also work from an external network), but I'm having trouble choosing the best way (or protocol) to do that.
Communicating through some sort of API (PHP, Python, etc.) isn't really an option (maybe I'm wrong about this) because the data is time-critical and should take the fastest route possible, so I'm trying to avoid the overhead that comes with HTTP or anything else sitting between the daemon and the app. On the other hand, the daemon should also be accessible by a locally running PHP script (there should be an API in the future, so maybe the extra "layer" isn't that critical?).
But even if I choose the API solution, what's the best way then? Sockets, general IPC?
Any suggestions or experience with a similar situation would be helpful.

In your question you say that it's time-critical but also that both machines are on the same network. As long as your application doesn't have performance problems of its own, you won't run into timing issues. It also depends on your daemon, though.
I've worked with a lot of daemons, including remote ones, and TCP sockets have always been a good option; I've never hit any limitations using them. Just be sure to choose between implementing a Service if your socket needs to stay alive for your app's whole life cycle, or an AsyncTask or Thread if it's only for a limited task.
This is what I use, for instance:
// connect with a timeout, then wrap the socket's input stream in a reader
Socket socket = new Socket();
socket.connect(new InetSocketAddress(host, port), timeout);
BufferedReader in = new BufferedReader(
        new InputStreamReader(socket.getInputStream(), "ISO-8859-1"));
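Since the daemon also has to be reachable from a locally running PHP script, the same plain TCP approach works on that side as well. This is only a minimal sketch; the host, port and the one-line command format are assumptions, not anything from the original setup:
<?php
// Hypothetical daemon address; adjust host/port to your setup.
$fp = stream_socket_client("tcp://192.168.1.10:5000", $errno, $errstr, 5);
if ($fp === false) {
    die("Could not connect: $errstr ($errno)\n");
}
// Send one newline-terminated command and read one line back.
fwrite($fp, "STATUS\n");
$reply = fgets($fp);
echo "Daemon replied: " . $reply;
fclose($fp);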

Related

what is the best way to handle real time notification in php? [closed]

I want to implement real-time notifications like Facebook's. There might be a huge number of notifications to send to different users, depending on server load and how efficiently the code runs. Which is the best approach?
1. using normal AJAX?
2. with node.js and socket programming?
3. something else?
Thanks in advance.
The choice of the proper platform greatly depends on your current architecture, knowledge, and budget.
Your question suggests that it is web based, for which there are only two basic options:
WebSocket: There exist many WebSocket server solutions, including compiled executables, PHP-based servers, and Node.js. This approach is quickly gaining popularity but isn't necessarily accessible to every budget, since it usually requires a dedicated server; the limitations of a typical VPS are usually too restrictive for systems that need that many simultaneous connections.
AJAX: The use of AJAX and its variants is still a very popular solution and, when well implemented, can be almost as efficient as WebSocket without having to keep connections open constantly. A one-second delay often doesn't matter, and Facebook chat is usually much slower than that (see the polling sketch after this answer).
For non-web-based solutions, anything is possible. If you develop a client-server application, you can have real-time connections similar to WebSocket that can be even easier to maintain.
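To illustrate the AJAX route, here is a minimal long-polling sketch of a PHP endpoint the browser could call in a loop. The checkNotifications() helper and the JSON shape are hypothetical placeholders for whatever storage you actually use:
<?php
// long_poll.php - hypothetical long-polling endpoint (sketch only)
$userId  = (int) $_GET['user_id'];
$timeout = 25;                // give up after ~25 seconds so the client reconnects
$start   = time();

while (time() - $start < $timeout) {
    $notifications = checkNotifications($userId);   // placeholder: query your DB/queue
    if (!empty($notifications)) {
        header('Content-Type: application/json');
        echo json_encode($notifications);
        exit;
    }
    usleep(500000);           // wait half a second before checking again
}

// nothing new arrived: return an empty list and let the client poll again
header('Content-Type: application/json');
echo json_encode([]);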
AJAX requests aren't truly real time, but you can set a short polling interval and it works for this case. The downside is that polling keeps your server busy and opens many connections; that's the price of the simple approach.
Using sockets is better, but you need more time to develop it.
Reading the following link can help you:
Ajax vs Socket.io
Nowadays we have two possible solutions: WebSockets and Comet. WebSockets are probably the better solution, but they have two major problems:
Not all browsers support them.
Not all proxy servers allow communication over WebSockets.
Because of that I prefer to use Comet (at least for now). It's not as good as WebSockets, but it's pretty straightforward and it works (even on IE).
For more details on real-time notifications, refer to the link above.

Should I build my server with Laravel? [closed]

I want to build a server that will listen to a custom port and talk with my web application through a custom protocol. The server will have a dispatcher and workers that will undertake a task and complete it.
Is Laravel up to the job, or should I go with something more specific?
EDIT:
I would like to clarify that it's not an API. Basically, a PHP script will run in a loop in CLI mode (meaning no Apache or nginx involved here). The script will open a socket and listen on a certain port for connections from clients. Once a client connects, the server will start some jobs and send back the answer. It also involves a job queue (probably a database) to which the server will connect, fetch jobs, and fork new processes that complete them.
EDIT:
It seems that you don't need much of a framework at all (except maybe for the database part, since if you use sockets you will probably not use much of the framework's functionality like routing or view templating). Depending on the complexity of your database layer I'd either use a framework or not; if it's very complex, features like Eloquent might help. I think you should consider how much of the framework you will actually use and whether you could pull in only the pieces you need through Composer instead.
END EDIT
Should you use Laravel/PHP to build a server? It will probably be too slow for that purpose.
1) If you want to build your own server (not a website or an API) I'd much rather go for Node.js or something along those lines (Ruby, Python, C#...).
2) By "custom protocol" I assume you don't mean something other than HTTP/TCP/IP? If you do, what exactly do you mean by a "custom protocol"?

Large-scale service monitoring at regular intervals [closed]

I need to get the status of services across a large number of servers in order to calculate uptime percentages. I may need to use multiple servers to do the checking. Does anyone know of a reliable way to queue them to be checked at a specific time/interval?
I'm writing the application in PHP, but I'm open to using other languages and tools for this. My only requirement is that it must run on Linux.
I've looked into things like Gearman for job queuing, but I haven't found anything that would work well.
In order to get uptime percentages for your services, you can execute commands that check the status of each service and log the results for further analysis/calculation (a simple polling sketch follows this list). Here are some ways of doing that:
System commands like top, free -m, vmstat, iostat, iotop, sar, netstat, etc. Nothing comes close to these Linux utilities when you are analysing/debugging a problem. These commands give you a clear picture of what is going on inside your server.
SeaLion: The agent executes all the commands mentioned in #1 as well as custom commands. The output of these commands can be accessed in a clean web interface. This tool comes in handy when you are working across hundreds of servers, as installation is very simple. And it's free.
Nagios: The mother of all monitoring/alerting tools. It is highly customizable but quite difficult for beginners to set up, although there are Nagios plugins that can help.
Munin
Server Density: A cloud-based paid service that collects important Linux metrics and gives users the ability to write their own plugins.
New Relic: Another well-known hosted monitoring service.
Zabbix
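If you would rather roll your own checks in PHP, as the question suggests, a cron-driven script along these lines can poll each service and log the result; the server list, ports and log path here are purely illustrative assumptions:
<?php
// check_services.php - run from cron, e.g. once a minute (sketch only)
$services = [
    ['host' => 'web01.example.com', 'port' => 80],
    ['host' => 'db01.example.com',  'port' => 3306],
];

$log = fopen('/var/log/service_checks.log', 'a');
foreach ($services as $svc) {
    $start = microtime(true);
    $fp = @fsockopen($svc['host'], $svc['port'], $errno, $errstr, 5);
    $up = ($fp !== false);
    if ($fp) {
        fclose($fp);
    }
    // one line per check: timestamp, host:port, UP/DOWN, response time in seconds
    fprintf($log, "%s %s:%d %s %.3f\n",
        date('c'), $svc['host'], $svc['port'], $up ? 'UP' : 'DOWN',
        microtime(true) - $start);
}
fclose($log);
The uptime percentage for a host is then just the share of UP lines over whatever period you care about, and the same script can be run from several checking servers if one isn't enough.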

What is the best way to mirror a DB server? [closed]

I am creating a website and am expecting fairly normal usage. For now I am setting up the system with one Apache server and two DB servers. I want any DB operation to be reflected on both DB servers so that I can keep one of them as a backup. How do I do that?
The ways I can think of are :
Perform the same operations on both DBs from PHP. This seems like a terrible idea.
Update one DB and sync the two servers periodically. This seems better.
Is there any better way to achieve this ? How is it done in Enterprises ?
If you're using MySQL, there is quite powerful built-in replication.
Check out the docs
Taking a backup every time a new operation happens is a terrible idea. No modern (or old) application works this way; even Windows System Restore takes backups at scheduled times, not on every operation.
I'd suggest you write an SQL dump script and schedule a cron job that runs it once or twice a day. If you really need the data on the other server immediately (say, so that if one of the DB servers crashes your app keeps working with the backup server right away), you can add an import script that runs as soon as the dump finishes.
If you are not in that special case, where a second server has to take over the moment the first one shuts down, you can simply store the dumped SQL files on the machine and not load them into a real database until they are needed.
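As a rough illustration of that dump-and-import idea, a small PHP wrapper around mysqldump could be run from cron; the credentials, database name and paths are placeholders for your own values:
<?php
// backup_db.php - schedule with cron, e.g. "0 3 * * * php /path/to/backup_db.php"
$db   = 'myapp';                              // placeholder database name
$dir  = '/var/backups/mysql';                 // placeholder backup directory
$file = sprintf('%s/%s-%s.sql', $dir, $db, date('Ymd-His'));

// dump the primary database to a timestamped file
$cmd = sprintf('mysqldump --user=%s --password=%s %s > %s',
    escapeshellarg('backup_user'), escapeshellarg('secret'),
    escapeshellarg($db), escapeshellarg($file));
exec($cmd, $output, $status);

if ($status !== 0) {
    error_log("mysqldump failed with status $status");
    exit(1);
}

// optionally push the dump straight into the standby server right away
// exec(sprintf('mysql --host=db2.example.com %s < %s',
//     escapeshellarg($db), escapeshellarg($file)));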

PHP connect to AIM TOC [closed]

In the middle of 2010, I found a class library called PHPToCLib. It ran flawlessly for over a year - I was able to implement a tremendous amount of my own custom code in an AIM bot that I could run from my command prompt. However, near the end of 2011, the servers stopped responding to the script. It connects to toc.oscar.aol.com on port 5190, and that hasn't changed. I am aware that AOL discontinued their TOC2 servers and that it's no longer possible to connect to them. However, I downloaded a program called TerraIM that uses the same specifications and is somehow able to connect. I was wondering if there are any updates on how I could get my script to connect, and if so, what do I need to change?
Thank you in advance.
TerraIM also supports the OSCAR protocol, which I assume it is defaulting to. If you are working with IM bots, the absolute best way to go is to leverage libpurple. Unfortunately there is no good PHP binding to libpurple, but there are a couple of Python bindings. If you don't wish to migrate your code, there is an implementation that provides an HTTP interface, which may be easy to integrate with depending on your use case. Alternatively, you could use Thrift to communicate between your existing PHP code and the Python bindings -- this would require a bit more coding than leveraging the HTTP interface. Here are some resources you may find helpful:
Python bindings:
github.com/fahhem/python-purple
github.com/Raptr/Heliotrope
HTTP interface:
github.com/atamurad/http-purple
Thrift:
http://thrift.apache.org/
