I am in the situation of editing the PHP files in the "www" folder of a server from local machines. In my case, all the users across the sites who share the intranet should be able to access the PHP files. It's more like I'm counting how many times users use an application which I developed. So my idea is to create a database in WAMP on the server and a PHP file which performs the update operation. Whenever a user uses my application, I will make the PHP file I created run on the server, and that counts the number of times my application has been used. I'm new to web development and I'm using Apache 2.4. Please let me know if you have a better approach. Thanks in advance.
Say the file is filename.php:
// The old mysql_* functions are deprecated/removed in current PHP, so use mysqli instead
$con = mysqli_connect("hostname", "username", "password", "user_db");
$sql = "UPDATE usage_count SET count = count + 1";
mysqli_query($con, $sql);
mysqli_close($con);
This code increases the hit count (the count column in the database) each time the file is requested from a client computer.
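If you also want to read the total back later (for example on a small stats page), a minimal sketch in the same style could look like this; the table and column names are taken from the snippet above, while the host and credentials are placeholders for your own:

<?php
// Connect with the same placeholder credentials as above
$con = mysqli_connect("hostname", "username", "password", "user_db");

// Fetch the current hit count from the usage_count table
$result = mysqli_query($con, "SELECT count FROM usage_count");
$row = mysqli_fetch_assoc($result);

echo "Application used " . (int)$row['count'] . " times";
mysqli_close($con);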
After at least 10 hours of poring over online resources, videos, and tutorials, I have two questions about connecting my Android app with my MySQL database.
Saving files
1) All the tutorials save the PHP files in C/WAMP/www/hello_php, for example, so when you go to localhost/hello_php everything works.
--Where do I store my PHP files if I don't want to use localhost? i.e. I want to use my MySQL server's IP address.
--For example, the guy from this video uses this:
HttpPost httppost = new HttpPost("http://192.168.168.0.3/~tahseen#amin/php/getAllCustomers.php");
--I presume the 192.168... is the IP of his server. Where did he save the "getAllCustomers.php" file?
--Note, I am using phpMyAdmin to handle the database.
Existing JDBC code
2) I have already created all the code required to insert/update/delete elements from my DB. I have done this in Java using JDBC in Eclipse. My understanding is that connecting my Android app to my DB with JDBC is not ideal / unsafe / not recommended.
--Is all the code I wrote useless? i.e. do I have to convert it all to PHP code?
Thanks in advance for your help
The PHP file in your example is stored in the home folder of user 'tahseen#amin', in the subdirectory 'php'. You can put your PHP file anywhere on the server as long as it is reachable via HTTP requests. Usually you would put the files in a subdirectory of the web root folder, which is typically /var/www/ on the server.
As far as I know, Android has no built-in support for connecting directly to MySQL databases, so you have to do the queries via PHP (or another language, as long as it is exposed as a service on your server). You can then send HTTP requests from your Android application in order to perform the database operations via the PHP scripts on the server.
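As a rough illustration of what a script like getAllCustomers.php could look like, here is a minimal sketch that queries MySQL and returns JSON for the app to parse; the database name, table, columns and credentials are made up for the example, and the real script in the video may be structured differently:

<?php
// Hypothetical endpoint: return all rows of a "customers" table as JSON
$con = mysqli_connect("localhost", "db_user", "db_password", "customers_db");

$result = mysqli_query($con, "SELECT id, name, email FROM customers");

$customers = array();
while ($row = mysqli_fetch_assoc($result)) {
    $customers[] = $row;
}

header('Content-Type: application/json');
echo json_encode($customers);
mysqli_close($con);

Your Android code would then request this URL over HTTP and parse the JSON response instead of talking to MySQL directly.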
I'm helping a friend migrate her WordPress site to GoDaddy, and I think I may have bitten off more than I can chew... I've never migrated a WordPress site before. This page here is the WordPress wiki for moving WordPress when your domain isn't changing. It doesn't seem too complex, but I'm terrified of accidentally ruining this website, and I don't understand a couple of things on the wiki.
The Wiki says
If database and URL remains the same, you can move by just copying your files and database.
Does this mean that I can just log in to her server from Filezilla and copy all of the files on the server? What does database mean, is that something separate from the files on the server?
If database name or user changes, edit wp-config.php to have the correct values.
This sort of goes with my first question... What would cause a database name or user to change?
Apologies for my ignorance, but after an hour or so of searching around for these answers I'm left just as confused.
Last but not least, is there anything else I should be aware of when migrating a wordpress? I'm a little nervous..
You are going to need to migrate your installation in two parts.
Part 1 you already alluded to. You will need to copy the files from one server to another. I am guessing you know how to do this, so I will not dive any deeper into it. If you do need more explanation, please let me know and I will edit this answer.
Part 2 is what you mentioned but said you did not understand: copying the database of the WP install. WordPress runs on PHP and MySQL. The "files" in part 1 are the PHP files (along with some HTML and CSS). You need to log into her MySQL server and do an export of the database. You should be able to export the database (How to export mysql database to another computer?) and import it into the new server on GoDaddy (Error importing SQL dump into MySQL: Unknown database / Can't create database).
Just take things slow, follow the guides that I have linked and do not delete anything from the first server until everything is working on the second. Please let me know if you do not understand anything.
If you don't feel comfortable with database exports and imports, try using plugins like:
http://wordpress.org/plugins/duplicator/
or
http://wordpress.org/plugins/wordpress-move/
Check their docs for info.
Luck!
• A database is literally a base of data. It's where websites (and other applications) store their data; for WordPress, that means data such as posts, user information, settings, and so on.
If you are using a cPanel setup then you would need to get access to it and navigate to phpMyAdmin which is the GUI for managing a database.
Now I'm not sure what type of setup you're using but that should be a start.
• A database has a connection server address (usually localhost), a database name, a username and a password. These are set up at the time the database is created.
When migrating servers, you would need to update those details in the wp-config.php file (I think around line 19 or so); see the sketch after this list.
• The annoying part about migrating WordPress to another server is usually the domain change, as you have to replace the old domain with the new domain throughout the database. However, since you're not changing domain names, it should be a smooth ride as long as the new server supports PHP and has a database.
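For reference, the database settings in wp-config.php look roughly like the block below; the values shown are placeholders, and you would swap in whatever database name, username, password and host the new GoDaddy account provides:

// wp-config.php (placeholder values - use the details from the new host)
define('DB_NAME', 'new_database_name');   // database name on the new server
define('DB_USER', 'new_database_user');   // MySQL username
define('DB_PASSWORD', 'new_password');    // MySQL password
define('DB_HOST', 'localhost');           // usually localhost, but check with the host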
I have a website that is currently using two servers, an application server and a database server. However, the load on the application server is increasing, so we are going to add a second application server.
The problem I have is that the website has users upload files to the server. How do I get the uploaded files on both of the servers?
I do not want to store images directly in a database as our application is database intensive already.
Is there a way to sync the servers across each other or is there something else I can do?
Any help would be appreciated.
Thanks
EDIT: I am adding the following links for people that helped me understand this question more:
Synchronize Files on Multiple Servers
and
Keep Uploaded Files in Sync Across Multiple Servers - LAMP
For everyone reading this post: NFS seems to be the better of the two.
NFS will keep files in sync; you could also use FTP to upload the files to all servers, but NFS looks like the way to go.
This is a question for serverfault.
Anyway, I think you should definitely consider moving to the "cloud".
Syncing uploads from one server to another is simply unreliable - you have no idea what kind of errors you can get or why you get them. Also, the syncing process will put load on both servers. For me, the proper solution is going to the cloud.
Should you choose the syncing method, you have a couple of options:
Use rsync to sync the files you need between the servers.
Use crontab to sync the files every X minutes/hours/days.
Copy the files upon some event (user login etc)
I got this answer from server fault:
The most appropriate course of action in a situation like this is to break the file share into a separate service of its own. Don't duplicate files if you have a network that can let the files be "everywhere (almost) at once." You can do this through NFS/CIFS or through a proper storage protocol like iSCSI. Mount as local storage in the appropriate directory. Depending on the performance of your network and your storage needs, this could add a couple of undetectable milliseconds to page load time.
So using NFS to share server files would work OR
as stated by @kgb, you could designate one single server to hold all uploaded files and have the other servers pull from it (just make sure you run a cron job or something to back up the files)
Most sites solve this problem by using a 3rd party designated file server like Amazon S3 for the user uploads.
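For example, with the official AWS SDK for PHP, pushing an upload to S3 instead of the local disk looks roughly like the sketch below; the bucket name and region are placeholders, credentials are assumed to come from the environment, and this is only an outline of the idea rather than a drop-in solution:

<?php
// Sketch: send a user upload to S3 instead of storing it on the web server
require 'vendor/autoload.php';  // AWS SDK for PHP installed via Composer

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',   // placeholder region
]);

$s3->putObject([
    'Bucket'     => 'example-uploads-bucket',   // placeholder bucket name
    'Key'        => 'uploads/' . basename($_FILES['file']['name']),
    'SourceFile' => $_FILES['file']['tmp_name'],
]);

Both application servers then read and write the same bucket, so nothing has to be kept in sync between them.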
Another option is a piece of software called BTSync; it is very easy to install and use, and it lets you keep files in sync across as many servers as you need. It takes only three terminal commands to install and is very efficient.
Take a look here
and here
You can use the DB server for storage... not in the database itself, I mean; run a web server on that machine too. It is not going to increase CPU load much, but it will require more bandwidth.
You could do it with rsync. People have suggested using NFS, but that way you create a single point of failure: if the NFS server goes down, both your servers are stuck. Correct me if I'm wrong.
I'm very, very new to PHP, so please bear with me. I'd like to configure my current website (whose files I've already converted to .php, and which I work on in Dreamweaver) so that every time I upload a new file to my server, the upload date is automatically displayed in the article (WordPress and other CMSs do this, but I don't have the time to build a WordPress template similar to my current site layout).
The problem with the database is that it's hosted at 1and1, where remote access (i.e. via Dreamweaver) isn't permitted, and I have to use phpMyAdmin at my host's website, which is ridiculously slow. Whenever I want to, for example, create a table, I have to go through a ton of slow-loading pages.
Is there any way to automate this process, at least to some degree?
You could always build a simple PHP script which connects to your database and runs an SQL query. You could then just visit the page and the query would be executed.
Here's a tutorial on writing SQL and running it with PHP if you're not sure: http://www.tizag.com/mysqlTutorial/
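A bare-bones version of such a script might look like the sketch below, which creates a hypothetical articles table with an upload date column; the credentials, host and table definition are placeholders for whatever 1and1 gives you and whatever structure you actually need, and you should remove the script from the server once you have run it:

<?php
// One-off admin script: run a query against the remote database
$pdo = new PDO('mysql:host=db_host;dbname=my_database', 'db_user', 'db_password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$pdo->exec("
    CREATE TABLE IF NOT EXISTS articles (
        id INT AUTO_INCREMENT PRIMARY KEY,
        filename VARCHAR(255) NOT NULL,
        uploaded_at DATETIME NOT NULL
    )
");

echo 'Done';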
PHPMyAdmin has a "SQL" tab, where you can enter text commands just as if you were typing them into your local mysql command line. If you're not sure where to find it, there's an image in this (otherwise irrelevant) tutorial. You can use this to create and modify tables, and also run queries of all kinds.
I'm attempting to build an application in PHP to help me configure new websites.
New sites will always be based on a specific "codebase", containing all necessary web files.
I want my PHP script to copy those web files from one domain's webspace to another domain's webspace.
When I click a button, an empty webspace is populated with files from another domain.
Both domains are on the same Linux/Apache server.
But I'm running into permission/ownership issues when copying across domains.
As an experiment, I tried using shell and exec commands in PHP to perform actions as "root".
(I know this can open major security holes, so it's not my ideal method.)
But I still had similar permission issues and couldn't get that method to work either.
Maybe a CGI script is a better idea, but I'm not sure how to approach it.
Any advice is appreciated.
Or, if you know of a better resource for this type of information, please point me toward it.
I'm sure this sort of "website setup" application has been built before.
Thanks!
I'm also doing something like this. The only difference is that I'm not making copies of the core files; the system has one core and only specific files are copied.
If you want to copy files, then you have to take the following into consideration:
An easy (less secure) way is to use the same user for all websites.
Otherwise (in case you want to provide separate access), you must create a different owner for each website, and you must set the owner/group on the copied files (this has to be done by root).
For the new website setup:
Either the main domain will run as root, and then it will be able to carry out the creation of a new website, or, if you don't want your main domain to run as root, you can do the following:
Create a cron job (or a PHP script that runs in a loop under the CLI) that is executed by root. It checks some database table every 2 minutes, for example, and from your main domain you add a record with the setup info for the new hosted website (or just execute some script that gains root access and does it without cron).
The script that does this can be written in PHP, or in any language you wish; it doesn't really matter as long as it runs with the correct access. A rough sketch of such a worker follows below.
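This is only an outline of the idea; the table name, columns, paths and credentials are invented for the example, and in practice you would add locking, logging and much stricter validation before letting anything run as root:

<?php
// setup_worker.php - run from root's crontab, e.g. every 2 minutes
// Reads pending setup requests and copies the codebase into the new webspace
$pdo = new PDO('mysql:host=localhost;dbname=hosting_admin', 'admin_user', 'admin_password');

$pending = $pdo->query("SELECT id, domain, owner FROM site_setups WHERE status = 'pending'");

foreach ($pending as $site) {
    // basename() keeps path tricks in the domain name from escaping /var/www
    $target = '/var/www/' . basename($site['domain']);

    // Copy the shared codebase and hand ownership to the site's own user
    exec('cp -a /var/www/codebase ' . escapeshellarg($target));
    exec('chown -R ' . escapeshellarg($site['owner'] . ':' . $site['owner']) . ' ' . escapeshellarg($target));

    $pdo->prepare("UPDATE site_setups SET status = 'done' WHERE id = ?")
        ->execute([$site['id']]);
}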
In my case I'm using the same user, since they are all my websites. The disadvantage is that the OS won't enforce restrictions; my PHP code will (I'm losing the advantage of user/group permissions between different websites).
Notice that open_basedir can cause you some hassle; make sure the correct paths are allowed (or disable it).
Also, there are some minor differences between FastCGI and suPHP (I believe they won't cause you too much trouble).