I am developing some software in PHP and MySQL. The client's web application is hosted on their server, but I want an easy way to turn their site off if they don't pay. I want to host, say, a config file on my server that contains an array of data describing how long their subscription is good for, what they have access to, and so on. I obviously don't want to store this information directly on their server, because they could manipulate the config to contain whatever they want. There are a number of other things I would use this for, but these are the most important parts. Also, if there is a better way, I am always open to suggestions.
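For concreteness, this is roughly the kind of check I have in mind, as a rough sketch only (the URL, the client parameter and the config keys below are placeholders, not anything that exists yet):

    <?php
    // Rough sketch: fetch the licence/config data from my own server and
    // disable the client's site once the subscription has expired.
    // The URL, the "client" parameter and the array keys are made up.
    $response = @file_get_contents('https://licensing.example.com/config.php?client=acme');

    if ($response === false) {
        // Decide whether to fail open or closed when my server is unreachable.
        die('Licence server unavailable.');
    }

    $config = json_decode($response, true);

    if (empty($config['active']) || strtotime($config['expires']) < time()) {
        die('This site is currently disabled. Please contact support.');
    }

    // ...continue bootstrapping the application, using $config['features'] etc.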
Thanks in advance!
Related
I'm making a web application, a kind of online shop, using PHP, jQuery, AJAX and JavaScript.
I want to launch my site on only one PC, on localhost. How should I set up my site so that it only runs on my single PC?
Even if somebody copies my code files and database files to their own PC, it should not run there. How can I do this?
The one way I know of is using the IP address, but I'm not quite sure whether that method works or not.
If someone gains access to your source code then there is nothing that you can do to stop them.
When hosting it on your own you can prevent external access but beyond that there is nothing you can do.
You can use an encoder script to encrypt your source code, and some of these come with the ability to lock the code down to a MAC address. I think they are all commercial solutions, though; start with IonCube and SourceGuardian. Zend might have something as well.
I would imagine each of these solutions would have comprehensive tutorials on their respective sites. Your workflow is basically to check out a copy of your source code from version control, and encode that folder as part of your build process.
Technically, encrypted code can be reverse-engineered, since the encryption key is built into the code. However, it is a lot of work for someone to do so, and even if they decode it, they won't have your comments or your meaningful variable/method/class names.
Make sure no one gains access to that PC (where your application resides); only then can you protect your application from being run by an unauthorized person. Once you take this security measure, you can easily prevent your application from being accessed from any other LAN computers by checking the IP address. This is how professional servers work, and so should yours.
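As a minimal sketch in PHP (assuming the application should only ever be used from the machine it runs on), you could reject anything that is not coming from the loopback address:

    <?php
    // Sketch: allow the application to be used only from the local machine.
    // Any request not coming from the loopback address is rejected.
    $allowed = array('127.0.0.1', '::1');

    if (!in_array($_SERVER['REMOTE_ADDR'], $allowed, true)) {
        header('HTTP/1.1 403 Forbidden');
        exit('Access denied.');
    }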
You can change the web server binding so that it listens on localhost (127.0.0.1) only.
Alternatively, you can create a filter rule so that the server only accepts requests from localhost/127.0.0.1.
With Apache you can do this via .htaccess or directory/server rules.
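For example, an .htaccess rule of roughly this shape (adjust to your Apache version; this is only a sketch):

    # Only accept requests from the local machine.
    # Apache 2.4 syntax:
    Require local

    # Apache 2.2 syntax (older servers):
    # Order deny,allow
    # Deny from all
    # Allow from 127.0.0.1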
When you want to share that code, you need to encrypt it with Zend Guard or comparable tools. There is also some licence management built into it, where you can bind licences to machines.
I am making a web application and I want it to be secure, so I'll be using SSL and will hash passwords. But my server is managed by a different company and it's a shared hosting server, so they have direct access to the database. I want to prevent any possible loss of sensitive information, so I am thinking about encrypting all the data in the database.
Is this a good way to keep data secure?
Are there any other ways to protect data in the database?
I am using PHP, MYSQL, Apache, and Linux
Please provide details. Also, if I am thinking in the wrong direction, please tell me that too.
Thanks in advance
This is not a big privacy issue
The internet is composed of a small number of websites / web applications using self-hosted solutions on fully private servers (owned and operated in their own NOC).
Everyone else is using some form or another of shared, virtualized, semi-private, semi-dedicated or colocated hosting. In every case the hosting company has full access to everything; they have physical access to the servers, and no amount of protection can help you there.
Shared hosting might be the easiest to access from the hosting company's perspective. But that's not really relevant: their policies should prevent them from operating in bad faith, because if they didn't, it wouldn't matter whether your data was the easiest or the hardest to access; it would only matter how interesting the data you have is to them (or to some random employee of theirs).
Finding a solution to the above non-issue
Some approaches might use:
Mounting an encrypted filesystem as a folder and setting up MySQL to use that folder to store its data;
MySQL encryption functions to encrypt the data in a particular cell or column (see the sketch after this list);
a library on top of SQLite with an encryption feature that encrypts the entire database file.
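A rough sketch of the second approach, using MySQL's AES_ENCRYPT()/AES_DECRYPT() through PDO (table, column and key names are made up for illustration; the key still has to live somewhere PHP can read it):

    <?php
    // Encrypt a single sensitive column with MySQL's AES functions.
    // The ssn column would need to be VARBINARY/BLOB to hold the ciphertext.
    $pdo = new PDO('mysql:host=localhost;dbname=shop;charset=utf8', 'db_user', 'db_pass');
    $key = 'my-secret-key'; // ideally kept outside the web root or fetched remotely

    $stmt = $pdo->prepare(
        'INSERT INTO customers (name, ssn) VALUES (:name, AES_ENCRYPT(:ssn, :key))'
    );
    $stmt->execute(array('name' => 'Alice', 'ssn' => '123-45-6789', 'key' => $key));

    $stmt = $pdo->prepare(
        'SELECT name, AES_DECRYPT(ssn, :key) AS ssn FROM customers WHERE name = :name'
    );
    $stmt->execute(array('key' => $key, 'name' => 'Alice'));
    $row = $stmt->fetch(PDO::FETCH_ASSOC);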
On the other hand, if your PHP files are on the same server and the database decryption password is stored inside your PHP files, any "intruder" could find it and use it if they wanted to.
You'd have to store the password on a different server, or obtain it from the user, in order not to have it present inside the local PHP files. It would obviously still be available at runtime, so if the "intruder" is a programmer, they will be able to retrieve it fairly easily.
I'm currently building a simple web application in PHP that other companies can use as one of their services. I want to host the application myself rather than install it on one of their servers, but I do want the accessibility that installing it there would offer. Example:
www.mywebapp.com is where i would host the web application.
www.company.com would be the domain name of the client.
webapp.company.com should redirect to www.mywebapp.com/?c=company. Likewise, webapp.company.com/view.php?v=test should be redirected to www.mywebapp.com/view.php?c=company&v=test, and so on as the user navigates further through the web app.
Can someone explain how I can achieve this, and whether this is the best option considering my requirements?
I recommend that you switch to implementing an API. That's how this problem is solved by many corporations: they simply have an API key that lets your server know which client is calling and therefore what to serve them.
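A minimal sketch of that idea in PHP (the keys and the key-to-client mapping are placeholders; in practice they would live in a database):

    <?php
    // Each client calls the central application with a key identifying them.
    $clients = array(
        'k3y-f0r-acme'    => 'acme',
        'k3y-f0r-example' => 'example',
    );

    $apiKey = isset($_GET['api_key']) ? $_GET['api_key'] : '';

    if (!isset($clients[$apiKey])) {
        header('HTTP/1.1 401 Unauthorized');
        exit(json_encode(array('error' => 'Invalid API key')));
    }

    $client = $clients[$apiKey];
    // ...serve the data that belongs to $client.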
Resources on APIs:
Google Tech Talk: http://www.youtube.com/watch?v=aAb7hSCtvGw [1:00:19 long]
http://blog.programmableweb.com/2011/01/06/from-the-trenches-web-api-design-best-practices/
Directory of some existing APIs: http://www.programmableweb.com/apis/directory
I think your idea IS possible if both servers are set up correctly, but doesn't it feel wrong to you?
You would need to have an 'A' record for both domains pointing to the same server.
http://corz.org/serv/tricks/htaccess2.php?page=all#section-rewrite_sub-domains
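Along the lines of that article, a mod_rewrite rule of roughly this shape on the shared server could forward requests arriving on the client's subdomain to the main app (host and parameter names follow the example in the question; this is a sketch, not tested configuration):

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^webapp\.company\.com$ [NC]
    RewriteRule ^(.*)$ http://www.mywebapp.com/$1?c=company [R=302,QSA,L]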
I have one site working completely for one client. Now I have some more clients who want the same thing replicated for them. Is there any way I can use this site as a base site? I plan to access the site from their domains while providing a separate database for each client.
I am using PHP and MySQL.
Thanks for any support. I would also appreciate your point of view on this process: do I have the right approach?
I have been told that there will be SEO issues if I use one site for multiple domains. I have no competent person available who can direct me on domain name linking. I have www.xyzuniversity.com, and 85% of the data is fetched from the database. Now I have to create abcuniversity.com, and I want to be able to just create a new database and be ready to go; I think I can make multiple sites like this if I succeed.
Thanks
You can point multiple domains to the site. You can get the domain name from the server variables ($_SERVER['HTTP_HOST']) and choose your database based on that. Make sure you don't have any absolute links.
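A minimal sketch of that approach (database names and credentials are placeholders):

    <?php
    // Pick a database per domain based on the Host header.
    $databases = array(
        'www.xyzuniversity.com' => 'xyz_university',
        'www.abcuniversity.com' => 'abc_university',
    );

    $host = $_SERVER['HTTP_HOST'];

    if (!isset($databases[$host])) {
        exit('Unknown domain.');
    }

    $pdo = new PDO(
        'mysql:host=localhost;dbname=' . $databases[$host] . ';charset=utf8',
        'db_user',
        'db_password'
    );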
Put the shared code into a shared directory and give each domain its own database configuration, so the database configuration is not shared.
Then start your application with the different configuration based on the domain, e.g. by server environment variables like the hostname.
If your design does not support a configuration that can be injected into the application, you need to maintain two code-bases, one for each domain, e.g. in source-code control with one branch per domain.
However, I suggest making your code modular enough that it supports configuration for the parts that need it, according to your needs. If it isn't there yet, think about moving closer to that and making the changes.
I could use some advice.
I'm building a website in which the general user needs to be able to transfer files to the site administrator. It could be done one of two ways:
1] Some kind of web-based interface (PHP perhaps) to send files to the FTP server. I've done some Googling but have yet to come up with anything concrete that works. I've considered using an applet, but I need something free, and it seems to me that people are hesitant to give applets permission to run in their browser.
2] Some kind of file transfer service. I've looked at services like Megaupload, but with a free account the files are public, and that will not work. I need something a user could use to send a file to the administrator, who could pick it up later.
If anyone has some suggestions, it would be appreciated.
Thanks in advance.
You can do file uploads via HTTP if you have enough space (which I assume you do, since you also have access to an FTP server). See here for more information.
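A minimal sketch of such an upload handler in plain PHP (the field name and the uploads/ directory are placeholders; the directory must be writable by the web server and should not be publicly browsable):

    <?php
    // upload.php: receive a file posted from the form below.
    if ($_SERVER['REQUEST_METHOD'] === 'POST' && isset($_FILES['userfile'])) {
        $file = $_FILES['userfile'];

        if ($file['error'] === UPLOAD_ERR_OK) {
            $target = __DIR__ . '/uploads/' . basename($file['name']);
            move_uploaded_file($file['tmp_name'], $target);
            echo 'File received.';
        } else {
            echo 'Upload failed with error code ' . $file['error'];
        }
    }
    ?>
    <form method="post" enctype="multipart/form-data">
        <input type="file" name="userfile">
        <button type="submit">Send file</button>
    </form>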