I'm working on a PHP WebSocket chat app and everything works fine locally, but when I bought a plan from a hosting company and uploaded my app, the WebSocket part didn't work because I couldn't find a way to run the server.php file.
On my local machine, I was able to run server.php from the terminal.
What I want to know is: is there any way to make a PHP file act like server.php, i.e. run persistently as a server?
You need hosting with access to a shell/bash console, which is usually unavailable on shared plans. The most basic hosting that would allow you to run this is a VPS; the more advanced option is a dedicated server.
But even with a VPS you might run into problems with the hosting company if your script consumes too many resources, or the VPS instance may be configured to kill 'lingering' processes (this depends on the VM and its configuration).
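On a VPS with shell access, one way to keep the WebSocket server running after you log out is a detached start like this (a sketch; the script and log paths are placeholders for your own layout):

```shell
# Start the WebSocket server detached from the terminal, so it survives logout.
# /var/www/app/server.php and the log path are placeholders.
nohup php /var/www/app/server.php > /var/log/websocket.log 2>&1 &
echo $! > /tmp/websocket.pid   # save the PID so the process can be stopped later
```

For anything long-lived, a process supervisor (e.g. supervisord or a systemd unit) is more robust than nohup, since it can restart the server if it crashes.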
Related
Does anyone know a solution for deploying a PHP webapp behind a firewall on mainly Windows servers? We have 100+ customers who host our webapp on premise, and we would like to set up a deployer as part of our Bitbucket pipeline, so our code gets deployed to all installations.
1 customer = 1 installation aka deployment
Today we use a small PHP script, and some version control software, to pull code changes once every day. It runs on both Linux and Windows servers.
Hit me with any solutions :)
You can make use of PHP Deployer.
Set up SSH access on the servers, then configure the deployment script with each server's IP (or hostname) as a target.
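As a minimal sketch (the host alias is hypothetical; real targets go in your deploy.php), Deployer is installed per project via Composer and then invoked from the pipeline:

```shell
# Install Deployer as a dev dependency of the project:
composer require --dev deployer/deployer
# Deploy to a host defined in deploy.php; "customer1" is a placeholder host alias:
vendor/bin/dep deploy customer1
```

Note that for the Windows servers this approach assumes an SSH server is available on each machine (e.g. OpenSSH for Windows); without SSH access, Deployer cannot reach the target.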
I'm working on a fun little PHP web application (a manager for my repeating, daily tasks), mostly for the exercise. The 'production' server is a bit limiting, and I cannot view httpd's (Apache's) error logs there, so I've set up my own local httpd as a development server (just good sense). However, my web app makes use of a MySQL database. I will create a local one eventually, but I thought, to make things easier to start, I would just use the remote one.
SQLSTATE[HY000] [2002] Permission denied.
This is what came back each time I tried running the web application from my local httpd. I'm using PHP's PDO database interface, and its mysql driver, and it works when deployed on the remote server. I made sure that my remote server had permissions for my local user. I tested connecting from my local machine from the mysql client, and it worked. I tested the PHP connection statement from the command-line and ... it worked. It is only causing a problem when running within the web application.
Please tell me how to solve this issue. This is the site on which I am getting the error: pickprogress.com
Some hosting providers give you a dedicated database server hostname, and instead of writing localhost or 127.0.0.1 you have to use that hostname.
I had been trying to solve this problem for the last 8 hours without finding a single solution, but I have now solved it.
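One more thing worth checking, since the connection works from the CLI but not from the web app: if the local httpd runs on a distro with SELinux enforcing (Fedora/CentOS/RHEL), "Permission denied" only inside the web server is the classic symptom of SELinux blocking httpd from making outbound network connections. This is an assumption about the local setup, but the check is quick:

```shell
# See whether SELinux is enforcing:
getenforce
# If so, allow Apache (httpd) to make outbound network connections, persistently:
sudo setsebool -P httpd_can_network_connect 1
# (httpd_can_network_connect_db is a narrower boolean covering only DB ports.)
```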
I am experimenting with using Redis for a Drupal website, hosted on Ubuntu 14.04.
I have installed the redis drupal module and am using the Predis library. I have also installed the 'redis-server' Ubuntu package and left the default configuration.
Configuring the Drupal site to use Redis for its cache backend works fine and the pages are lightning fast.
The problem arrived when I tried to spin up an m3.medium AWS instance and host the redis server there. The reason is that we want to use one redis server and connect to it from multiple servers (the live website is hosted on multiple instances behind a load balancer, so each instance should connect to the same redis server).
I have set up the redis server on the instance, modified the redis.conf file to bind the correct IP address so it can be accessed from the outside, opened up the 6379 port, then tried connecting to it from my local computer
redis-cli -h IP
It worked fine so I decided to flip my local site's configuration to point to the new redis server.
The moment I did that, the site became painfully slow; at first I thought it might not load at all. After almost a minute it finally loaded the home page. Clicking around the site was almost as slow, though the time dropped to maybe 10-15 seconds. That is still unacceptable and doesn't even compare to the lightning-fast page loads with the local redis server.
My question is: is there some specific configuration I need to do to make the remote connection faster? Is there something preventing it from performing well? some bottleneck somewhere?
Let me know if you want me to add the drupal settings.php configuration, although I am using a pretty standard config.
I ran the same configuration for a PHP application as you are trying, and had no issues hosting Redis on either a small or a medium instance while handling large amounts of traffic, so there must be a config issue somewhere. Another option for debugging would be to try switching to ElastiCache (AWS's Redis offering); it requires that all clients be within the same region, but it could make finding your problem very easy.
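One likely culprit (an assumption, since the question includes no timings) is network round-trip latency: Drupal's cache backend issues many Redis commands per page, so even a few milliseconds per command multiplies into seconds. You can compare the remote and local round-trip times with redis-cli:

```shell
# Measure round-trip latency to the remote Redis instance
# (IP is a placeholder; assumes port 6379 is reachable):
redis-cli -h IP --latency
# Compare against the local instance:
redis-cli -h 127.0.0.1 --latency
```

If the remote latency is tens of milliseconds (e.g. connecting to the AWS instance from outside its region), the per-page command count explains the slowdown, and co-locating the web servers with Redis in the same region/VPC is the fix.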
I have been struggling with this for a long time now so I decided to ask it here.
I want to use PhantomJS on my host, which does not have root access. Right now I am using 000webhost.com to test things; it runs Apache 2.2.19 (Unix), and I uploaded phantomjs-1.9.1-linux-i686.tar.bz2 to the public_html/phantomjs-1.9.1-linux-i686.tar/ directory.
Should this work, or do I need root access to use PhantomJS?
If it should work, is the PHP code executing it wrong? exec('http://example.com/phantomjs-1.9.1-linux-i686.tar/phantomjs-1.9.1-linux-i686.tar.bz2 http://example.com/countdown.js');
Will VPS hosting with root access work?
As you can see I am really confused, and any help would be great.
Usually executing applications in a shared hosting environment isn't the best practice and your hosting provider may not allow it. I would recommend getting a VPS if you need to run PhantomJS since you will have full control over how you run your applications.
Running an application as root and calling it from PHP is a bad idea. If your system is compromised, then the attacker has root access. Create an account with restricted privileges on your VPS for running PhantomJS. Then you can proceed to call PhantomJS from PHP using exec.
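A sketch of that setup on a VPS (the user name and paths are assumptions; adjust to your layout). Note also that exec() must be given a filesystem path to the extracted binary, not an http:// URL as in the question:

```shell
# Create a restricted service account with no login shell:
sudo useradd -r -s /usr/sbin/nologin phantom
# Extract the archive somewhere outside the web root:
sudo tar xjf phantomjs-1.9.1-linux-i686.tar.bz2 -C /opt
# Verify the binary runs as the restricted user:
sudo -u phantom /opt/phantomjs-1.9.1-linux-i686/bin/phantomjs --version
```

From PHP you would then exec the same `sudo -u phantom /opt/.../phantomjs script.js` command (with a matching sudoers entry allowing the web server user to run exactly that command).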
I need to set up cron/automatic tasks on Windows (shared), Linux (shared), and WAMP servers.
The problem is that the project runs on multiple environments.
So what is the best way to set up cron/scheduled tasks?
What I actually need to do is check email servers for new emails and, if anything is found, save it to the local DB. If you have any alternative other than a cron job, please let me know.
Thanks.
EDIT:
As I mentioned in the question, I have multiple emails/filters, so I need to run something in the background to fetch data periodically: CRON on Linux and Scheduled Tasks on Windows.
But the real problem is that I am doing this on shared hosting (or it depends on the client), so I cannot use CRON/Scheduled Tasks.
Example: the project is installed on GoDaddy Windows shared hosting; it is a Windows server, so there is normally no support for CRON, and they won't allow the use of Scheduled Tasks.
The question is: is there any alternative to CRON/Scheduled Tasks?
Short Answer
I don't know of any alternatives to CRON/Scheduled Tasks that would fulfill your need.
I suggest you outsource the scheduling to another server; see the possibilities below.
I came up with the following possibilities:
Shared hosting with CRON jobs (easiest)
Look for a shared hosting provider who lets you add cron jobs (e.g. through webspace management). HostEurope (german) would be such a host.
You own a dedicated (virtual) server
Given you deploy this project to multiple shared servers but own a dedicated (virtual) server for yourself:
Make your script publicly available but guard it with a strong authorization mechanism (a hard-to-guess request token, white-listing certain IPs as callers, ...).
Set up a cron job on your own server which calls the script on the client's webhost.
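A sketch of that cron job on your own server (the URL and token are placeholders; keep the token out of the crontab if other users can read it, e.g. in a root-owned file):

```shell
# Crontab entry: call the client's guarded script every night at 03:00.
# The token value is a placeholder and must match what the script checks.
0 3 * * * curl -fsS -o /dev/null "https://client.example.com/cron.php?token=LONG_RANDOM_TOKEN"
```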
You don't own a server
Same as the previous possibility, except you don't own a dedicated server.
Set up a virtual machine at some cloud provider (e.g. OpenShift) and add the cron job there. Don't use this instance for other jobs, and it should fulfill your needs perfectly (Reference: https://openshift.redhat.com/community/blogs/getting-started-with-cron-jobs-on-openshift)
Requirements don't meet infrastructure (likely!)
Your client/project has requirements that don't fit a shared hosting environment. You are strongly encouraged to get a hosting plan that fulfills your true needs. The price differences between shared hosting and entry-level virtual or dedicated servers aren't so steep that investigating them is out of the question.
As everyone else suggested, if this is a web page you can use wget to run it.
If it is a CLI script, you will have to run it with php /directory/filepath.php.
If the actual question is HOW you are going to run it periodically, you will have to use cron on *NIX and Scheduled Tasks on Windows Server.
If you want to install the cron job automatically, you will have to detect the OS and act depending on whether it is Windows or *NIX.
A Google search will give you results on how to do it in both environments.
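As a sketch, the two variants above would look like this in a crontab (the URL and file paths are placeholders for your own script):

```shell
# Run a web-accessible script every 15 minutes via wget (URL is a placeholder):
*/15 * * * * wget -q -O /dev/null http://example.com/check-email.php
# Or run the CLI version directly with the PHP interpreter (paths are placeholders):
*/15 * * * * /usr/bin/php /var/www/app/check-email.php >> /var/log/check-email.log 2>&1
```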
Edit after reds clarification
As Samuel Herzog pretty nicely says, on shared hosting you (usually) have a control panel.
The best-known ones for Linux are:
Cpanel: http://www.siteground.com/tutorials/cpanel/cron_jobs.htm
Plesk: http://www.hosting.com/support/plesk/crontab
Webmin: Setting up a cron job with Webmin
And for Windows I only have some familiarity with Plesk, for which the procedure is the same as above.
If you don't have a control panel but have shell access (Linux), you could follow this tutorial.
If you don't have a control panel but have Remote Desktop (Windows), you could follow this tutorial.
If you don't have any of the above, you should follow Samuel Herzog's suggestion about a VM at a cloud provider, or consider upgrading to a VPS or a dedicated server.