So I'm building an application that monitors Raspberry Pi based devices on a network. The devices run a program that exposes a statistical array about the device's performance, which we need to log; you can access this array via a socket connection to the device on a port. The network currently has 100 of these devices, but it will soon grow to several hundred devices on a single network.
Currently the application approaches this by deploying a script via ssh2_scp to each of the devices, then iterating through the list of local IPs and using stream_context_create and file_get_contents to ping the monitoring file on each remote device. The monitoring file then gets the stat array from the local machine and POSTs the data back to the app, which stores it in the DB.
This is not really ideal at the moment. I'm recording around 1 minute 45 seconds to cycle through the IPs, check them (in a cheat fashion, using a counter $i++ in a while loop over a range of numbers, rather than getting all the IPs from the database, which it will need to do once more IPs and new locations are added), retrieve the results, and insert them into the DB. The cron job runs the ping script every 2 minutes, so as the number of devices increases the cycle will overrun the 2-minute window and data will start to backlog. On top of that, there isn't really any way to check whether the stream context actually retrieved any data, or to check separately from the submitted data whether a device is operating correctly.
The server the application sits on is a massive beast, so computational power on that side is not a problem. On the Raspberry Pi devices, though, I'd prefer not to run any web server (at most maybe PHP's built-in web server), both for security and because serving web pages is not their primary job.
I've been looking at running PHP daemon services from the command line, and wondering whether it's better suited to run the monitor application as a daemon that establishes socket connections directly to the machines to pass data back and forth. If I was to go down this road, how would I approach it? Would I create a daemon script for the monitored devices that listens on a port and returns the stat array through it, and then have the monitoring application daemon establish connections to each device and feed the data in?
Any advice on the best method/most efficient way of doing this is highly appreciated.
you can access this array via a socket connection to the device on a port.
I would simply create a script that accepts a range of hosts as a parameter, then hits this port with netcat. The script iterates through the specified hosts and dumps the results to the database. This avoids any execution on the Raspberry Pi side. For scalability, simply run multiple copies of the script simultaneously on your beastly server, each with a different range of devices.
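The same idea works in plain PHP without netcat, which also fixes the "can't tell whether the stream got any data" problem from the question. Here's a minimal sketch of the server-side poller; the port (9500) and the one-JSON-line-per-connection payload format are assumptions, since the question doesn't specify them:

```php
<?php
// Sketch of the server-side poller: connect to each device's stat port,
// read one line, decode it. Port 9500 and JSON framing are assumptions.

function fetch_stats(string $host, int $port, float $timeout = 2.0): ?array
{
    $errno  = 0;
    $errstr = '';
    $sock = @stream_socket_client("tcp://$host:$port", $errno, $errstr, $timeout);
    if ($sock === false) {
        return null; // device unreachable -- flag it instead of silently skipping
    }
    stream_set_timeout($sock, (int) $timeout);

    $raw = fgets($sock); // assumes the device sends one JSON line then closes
    fclose($sock);

    if ($raw === false) {
        return null; // connected but got no data -- also worth flagging
    }
    $stats = json_decode(trim($raw), true);
    return is_array($stats) ? $stats : null;
}
```

A driver would then iterate hosts pulled from the database rather than a counter, insert successful results, and mark any host that returned null as unreachable, so dead devices become visible instead of just missing from the log.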
We have built the backend of a mobile app in Laravel and MySQL. The application is hosted on AWS EC2 and uses an RDS MySQL database.
We are stress testing the app using JMeter. When we send up to 1000 API requests from JMeter, it seems to work fine. However, when we send more than (roughly) 1000 requests in parallel, JMeter starts getting internal server error (500) responses for many requests, and the percentage of 500 errors increases as we increase the number of requests.
Normally, we would expect that if we increase the number of requests, they should be queued and responses should slow down once the server is out of resources. We also monitored the resources on the server, and they never reached even 50% of what is available.
Is there any timeout setting, or any other setting I could tweak, so that we don't get internal server errors before reaching 80% resource usage?
Regards
Syed
500 is the externally visible symptom of some sort of failure in the server delivering your API. You should look at the error log of that server to see details of the failure.
If you are using PHP scripts to deliver the API, your MySQL (RDS) server may be running out of connections. Here's how that might work.
A PHP-driven web server under heavy load runs a lot of PHP instances, and each instance opens one or more connections to the MySQL server. When (PHP instances × connections per instance) grows too large, the MySQL server starts refusing new connections.
Here's what you need to do: restrict the number of PHP instances your web server is allowed to use at a time. When you restrict that number, incoming requests will queue up (in the TCP connect queue of your OS's network stack). Then, whenever an instance becomes available, it serves the next item in the queue.
Apache has a MaxRequestWorkers parameter, with an extremely large default value of 256. Try setting it much lower, for example to 32, and see whether your problem changes.
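As a sketch, with the prefork MPM that experiment might look like the following (these values are the experiment from above, not recommendations tuned to your hardware):

```apacheconf
# mpm_prefork example: cap the number of concurrently running workers.
# MaxRequestWorkers 32 is the trial value suggested above, not a tuned figure.
<IfModule mpm_prefork_module>
    StartServers           4
    MinSpareServers        4
    MaxSpareServers        8
    MaxRequestWorkers      32
    MaxConnectionsPerChild 1000
</IfModule>
```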
If you can shrink the number of request workers, you may paradoxically improve high-load performance: serializing many requests often yields better throughput than trying to run them all in parallel.
The same goes for the number of active connections to your MySQL server. It obviously depends on the nature of your queries, but generally speaking fewer concurrent queries means better performance. So adding MySQL connections won't solve any real-world problem.
You should be aware that the kind of load imposed by server-hammering tools like JMeter is not representative of real-world load. Handling 1000 simultaneous JMeter operations without failure is a very good result. If your load-testing setup is robust and powerful enough, you will always be able to bring your server system to its knees, so deciding when to stop is an important part of a load-testing plan. If this were my system I would stop at 1000 for now.
For your app to be robust in the field, you should probably program it to respond to a 500 status by waiting a random amount of time and trying again.
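That retry-with-random-backoff idea can be sketched in a few lines of PHP. The callable, the attempt cap, and the delay bound are illustrative assumptions, not part of any particular client library:

```php
<?php
// Sketch of client-side retry with a random backoff. $request returns
// a result on success, or null to stand in for "got a 500 response".

function retry_with_backoff(callable $request, int $maxAttempts = 4, int $maxDelayMs = 500)
{
    for ($attempt = 1; $attempt <= $maxAttempts; $attempt++) {
        $result = $request();
        if ($result !== null) {
            return $result;                              // success: stop retrying
        }
        if ($attempt < $maxAttempts) {
            usleep(random_int(0, $maxDelayMs) * 1000);   // random wait before the next try
        }
    }
    return null; // give up after $maxAttempts failures
}
```

The random (rather than fixed) delay matters: it spreads retries out so that many clients that failed at the same moment don't all hammer the server again in lockstep.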
First of all, I'm using pthreads. The scenario is this: there are game servers that send logs over UDP to an IP and port you give them. I'm building an application that will receive those logs, process them, and insert them into a MySQL database. Since I'm using blocking sockets (the number of servers will never go over 20-30), I'm thinking of creating a thread for each socket that will receive and process the logs for that socket. All the MySQL data that needs to be inserted into the database will be sent to a Redis queue, where it will get processed by another PHP process. Is this OK, or better, is it reliable?
Don't use PHP for long-running processes (like the PHP script used for inserting in your diagram). The language is designed for web requests, which die after a couple of milliseconds or at most a few seconds. You will run into memory problems all the time.
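If you do end up running a long-lived PHP consumer anyway, one common mitigation for the memory issue is to give the worker a job and memory budget, let it exit cleanly, and have a supervisor (systemd, supervisord, or even cron) respawn it. A minimal sketch, with illustrative names and limits:

```php
<?php
// Sketch of a self-limiting long-running worker: process jobs until a
// job count or memory threshold is hit, then exit and let a supervisor
// restart the process, so slow leaks never accumulate for long.

function run_worker(
    callable $nextJob,                       // e.g. a blocking pop from the Redis queue
    callable $handle,                        // e.g. the INSERT into MySQL
    int $maxJobs = 10000,
    int $memLimit = 128 * 1024 * 1024        // 128 MB budget (an assumption)
): int {
    $done = 0;
    while ($done < $maxJobs && memory_get_usage(true) < $memLimit) {
        $job = $nextJob();
        if ($job === null) {
            break;       // queue drained; exit and get respawned
        }
        $handle($job);
        $done++;
    }
    return $done;        // number of jobs processed before exiting
}
```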
I'm trying to have 100 Android devices display a text string based on a server parameter. When the text on the server changes from "Hello World" to "Everything Changed", I want all 100 Android devices to update simultaneously, ideally instantly, as soon as the change happens.
It runs on an isolated LAN, so C2DM isn't feasible, and polling every second seems rather traffic-heavy (especially if there are 1000 devices later). Are there any recommendations on how to move from polling to pushing, or at least how to make this scalable?
I've been considering just keeping the connection open and having the server return content only when it changes, but I'm worried about timeout issues and PHP's ability to handle hundreds of concurrent connections... Any pointers or advice?
You should not poll. Since you are on a private network and the number of devices is limited, you are better off keeping TCP sockets open and sending data from server to client over the open socket.
But you must understand what you are doing, so read the following:
1) To understand how many connections can be opened, and whether that is enough for your needs:
How many socket connections possible?
2) If you have a lot of devices, I mean more than a thousand, you may hit failures on the server side. To avoid that, you must read about async I/O:
http://en.wikipedia.org/wiki/Asynchronous_I/O and other resources found on the net,
and async I/O in PHP: Can PHP asynchronously use sockets?
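In plain PHP, the usual building block for watching many open sockets from one process is stream_select(). A minimal sketch of the waiting side (the helper name is illustrative):

```php
<?php
// Sketch: wait on many open sockets at once with stream_select(),
// instead of doing one blocking read per socket. Returns whichever
// of the given streams have data ready within the timeout.

function wait_readable(array $streams, float $timeout): array
{
    if ($streams === []) {
        return [];
    }
    $read   = $streams;   // stream_select() modifies this array in place
    $write  = null;
    $except = null;
    $sec    = (int) $timeout;
    $usec   = (int) (($timeout - $sec) * 1000000);

    $n = stream_select($read, $write, $except, $sec, $usec);
    return ($n === false || $n === 0) ? [] : $read;
}
```

A push server would run one such loop over all the device sockets: read from whichever become readable, and write the updated string out to every connection when the server parameter changes.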
What is the best way to set up a web application to check free RAM on the server and keep users in a waiting queue until sufficient RAM is available again?
I think fetching free RAM on the server would only be possible using exec(), right?
I want to enforce this in my web application because it uses a lot of RAM, and during high traffic I don't want my server to grind to a halt.
Thanks.
You should separate the part of your code that handles web requests from the part that does the resource-intensive work. When you get the web request, put the job into a queue, which separate worker processes pull jobs off of and do the work. You can have the user's page poll your server every X seconds with AJAX until their job has been processed, then update the page.
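Here's a minimal in-memory sketch of that request/queue/worker split. In a real deployment the queue and status map would live in Redis or a database table shared between the web requests and the workers; this class just shows the shape, and all names are illustrative:

```php
<?php
// Minimal sketch of the queue idea: the web request enqueues and gets
// an id; workers pull jobs off and do the RAM-heavy work; the AJAX
// poll asks for the status of that id. In production, back this with
// Redis or a DB table instead of in-process arrays.

class JobQueue
{
    private array $jobs = [];    // id => payload, still waiting
    private array $status = [];  // id => 'queued' | 'done'
    private int $nextId = 1;

    // Web request side: enqueue and return an id the page can poll with.
    public function enqueue($payload): int
    {
        $id = $this->nextId++;
        $this->jobs[$id]   = $payload;
        $this->status[$id] = 'queued';
        return $id;
    }

    // Worker side: take one job, run it, mark it done.
    public function workOne(callable $handler): bool
    {
        $id = array_key_first($this->jobs);
        if ($id === null) {
            return false;               // nothing to do
        }
        $payload = $this->jobs[$id];
        unset($this->jobs[$id]);
        $handler($payload);             // the resource-intensive work
        $this->status[$id] = 'done';
        return true;
    }

    // AJAX poll side: "is my job finished yet?"
    public function status(int $id): ?string
    {
        return $this->status[$id] ?? null;
    }
}
```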
I am working on an art/programming project that involves using a lab of 30 iMacs. I want to synchronize them in a way that will allow me to execute a script on each of them at the very same time.
The final product runs in Flash Player, but if I can synchronize any type of data signal through a web page, I'd be able to run the script at the same time. So far my attempts have all had fatal flaws.
The network I'm using is somewhat limited. I don't have admin privileges, but I don't think it really matters. I log into my user account on all 30 iMacs and run the page or script there to launch my wares.
My first attempts involved running Flash Player directly.
At first I tried using the system time and had the script run every two minutes. This wasn't reliable because, even though the time on my user account is synced, there is discrepancy between the iMacs; even a quarter of a second of drift is too much.
My next try involved having one Mac act as the host, writing a variable to a text file. All the other 29 Flash players checked this file for changes several times a second. This didn't work: it was fine with 3 or 4 computers but then became flaky. The strain on the server was too great, and Flash is just unreliable. I figured I'd try local shared objects, but that wasn't reliable either. I tried having the host computer write to 30 files and each Mac read only one, but that didn't work. I tried LocalConnection, but it isn't made for more than two computers.
My next try involved running a PHP server-time script on my web server and having the 30 computers check the time from that script nearly 30 times a second. I don't think my hosting plan supports this: the server would just stop working after a few seconds, too many requests or something.
Although I haven't had success with a remote server, it would probably be more reliable with another, cleverer method.
I do have one kludge solution as a last resort (you might laugh): I would take an audio wire, buy 29 audio splitters, and plug them all in. Then I would run Flash Player locally and have it execute when it hears a sound. I've done this before; all you have to do is touch the other end of the wire, and the finger static is enough to set it off.
What can I do now? I've been working on this project on and off for a year and just want to get it going. If I can get a web page synchronized on 30 computers in a lab, I could just pass data to Flash and it would likely work. I'm more confident with a remote server, but if I can do it using the local Mac network, that would be great.
OK, here is how I approached my problem, using a socket connection between Flash and PHP. Basically, you first set up a client script that is installed on all 30 iMac 'client' machines (let's assume all machines are on a private network). When these clients start, they connect to a PHP server via a socket. The PHP server script has an IP and a port that the clients connect to; it handles the client connection pool, message routing, and so on, and it runs at all times. The socket connection allows server-client interaction by sending messages back and forth, and these messages can trigger things to do. You should read up more on socket connections and server-client interaction. This is just a short summary of how I got my project done.
Simple tutorial on socket/server client connection using php and flash
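For reference, here is a condensed sketch of the server loop described above, using stream_socket_server plus stream_select. The bind address, port (9100), and newline message framing are assumptions; a real Flash policy-file handshake is omitted:

```php
<?php
// Condensed sketch of the PHP socket server: accept clients into a
// pool, and when any client sends a line, broadcast it to everyone.
// Port 9100 and newline framing are assumptions.

function broadcast(array $clients, string $message): void
{
    foreach ($clients as $client) {
        @fwrite($client, $message . "\n"); // one newline-framed message per client
    }
}

function serve(string $bind = 'tcp://0.0.0.0:9100'): void
{
    $server = stream_socket_server($bind, $errno, $errstr);
    if ($server === false) {
        die("could not bind: $errstr\n");
    }
    $clients = [];
    while (true) {
        $read   = $clients;
        $read[] = $server;          // also watch the listening socket for new clients
        $write  = null;
        $except = null;
        if (stream_select($read, $write, $except, null) === false) {
            break;
        }
        foreach ($read as $stream) {
            if ($stream === $server) {
                $clients[] = stream_socket_accept($server);   // new client joins the pool
            } elseif (($line = fgets($stream)) !== false) {
                broadcast($clients, trim($line));             // route message to every client
            } else {
                fclose($stream);                              // client disconnected
                $clients = array_filter($clients, fn ($c) => $c !== $stream);
            }
        }
    }
}
```

Each client (the Flash movies, here) just keeps its socket open and fires its script whenever a "go" message arrives, which is what makes all 30 machines trigger at effectively the same instant.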