I want to implement faceted search for a project of mine. I'm using PHP 5, MySQL, and Symfony 1.4. Apparently the community points to Apache Solr, which seems to do exactly what I want to accomplish.
The problem is that the website is going to live on a hosting provider that doesn't let me set up Solr (it is a shared hosting environment that allows neither Tomcat nor Solr to run).
So could you please point me toward possible alternatives, or a way to set up Solr in such an environment?
EDIT
My hosting provider supports neither Solr nor solutions such as opensolr. In general, I can't make my environment connect to a process on the same server or a remote one. It seems the only available option is Zend_Search_Lucene. So does this support faceted search? Or if you have another option in mind, please share it! I feel like I'm in the middle of nowhere!
EDIT 2
As this question has been open for about a week, from the answers given so far I am surprised (and disappointed) that there is no library (not service) available in PHP to implement faceted search. It seems this either needs to be implemented manually or via the solutions provided below.
Change hosts, or host the Solr index elsewhere - for example, a quick search revealed that http://www.opensolr.com/ provides Solr hosting; there are no doubt many others.
Performance won't be great and it won't scale, but you can always build a reverse tunnel over HTTP. Basically, instead of the web server opening an outbound connection to the Solr server, the Solr server connects to the web server to request jobs and to post job results.
What you'll need to do:
The browser posts a search query; the query is simply queued in the database.
The tunnelling agent (a worker running next to Solr) periodically connects to the web server (over plain ol' port 80) to fetch queries from the job queue, passes them to the Solr server, and POSTs the results back to the web server.
The browser periodically polls the web server for finished search results.
Bonus marks: if your server allows concurrent request processing, use long polling to improve latency.
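For concreteness, here is a rough sketch of the worker that would run next to Solr. The endpoint paths (jobs/next.php, jobs/result.php) and the local Solr URL are invented for illustration; they are not part of any real API.

<?php
// Hypothetical tunnelling worker running next to the Solr server.
// The web server endpoints and the Solr core URL below are assumptions.
$webServer = 'http://www.example.com';
$solr      = 'http://localhost:8983/solr/select';

while (true) {
    // Fetch the next queued query from the web server over plain port 80.
    $raw = file_get_contents($webServer . '/jobs/next.php');
    $job = json_decode($raw, true);
    if (empty($job)) {
        sleep(2);   // nothing queued; wait before re-polling
        continue;
    }

    // Run the query against the local Solr instance.
    $results = file_get_contents($solr . '?' . http_build_query(array(
        'q'  => $job['query'],
        'wt' => 'json',
    )));

    // POST the results back into the job queue on the web server.
    $ctx = stream_context_create(array('http' => array(
        'method'  => 'POST',
        'header'  => "Content-Type: application/json\r\n",
        'content' => json_encode(array('id' => $job['id'], 'results' => $results)),
    )));
    file_get_contents($webServer . '/jobs/result.php', false, $ctx);
}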
In short, bite the bullet and move to a decent host.
Try to avoid Zend_Search_Lucene; it's not really fast. (Well, it's pretty good given that it's implemented in PHP and doesn't run as a daemon.)
Hosted Solr, as Paul suggested, sounds like a good alternative if you are not willing to change hosts.
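And if you do end up implementing facets manually (as your second edit anticipates), the core of it on plain MySQL is just one grouped count per facet field. A minimal sketch, where the table and column names (products, category, brand) are invented:

<?php
// Manual faceting sketch: one GROUP BY query per facet field.
$pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

// $field must come from a whitelist, never from user input,
// to avoid SQL injection.
function facetCounts(PDO $pdo, $field, $where = '1=1')
{
    $sql = "SELECT $field AS value, COUNT(*) AS cnt
            FROM products WHERE $where GROUP BY $field";
    return $pdo->query($sql)->fetchAll(PDO::FETCH_KEY_PAIR);
}

// e.g. array('books' => 12, 'music' => 7, ...)
$byCategory = facetCounts($pdo, 'category');
// Counts for a second facet, narrowed by the user's current selection.
$byBrand = facetCounts($pdo, 'brand', "category = 'books'");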
Related
I'm doing a group project and we're creating an online game. We're about halfway done, and now it's time to implement a database to store our records/data and make the website go live on the internet.
I'm just confused about how PSQL works exactly. My understanding is that PSQL needs to be running on some server in order to access it. For previous assignments, I downloaded Postgres for my Mac and ran it on localhost. The PHP code was something along the lines of:
$dbconn = pg_connect("host=localhost port=5432 dbname=mydbname");
So, if we intend to use PSQL, where would the server be? Do one of us have to host the server? Can we use some sort of free online server? How do we connect to that server with PHP?
In summary, I have two main questions:
How do we make our code go live on the internet for free? (It's just a temporary website and will only be up for a few weeks at most)
How can we all access a shared PSQL database?
Sorry for the noob questions, I just got started with web development and am still learning.
So, if we intend to use PSQL, where would the server be? Do one of us have to host the server? Can we use some sort of free online server? How do we connect to that server with PHP?
PostgreSQL is going to have to run on some machine visible to anyone who needs to access it. If only your web server (i.e., the machine running PHP and your website) needs to talk to PostgreSQL, then PostgreSQL can be installed on your web server. This is a very common configuration.
The server might also run on the same LAN as your web server, or on an entirely different network on a different continent. The most important thing is that any machine which must connect directly to the database can actually reach it. If you're building a website, this means you have a web server, and your web server will need to connect to the PostgreSQL server. The second most important thing is that the two should share a very fast connection, for the sake of performance and efficiency.
It's probably most common for your web server to also host the database. On an Ubuntu machine, installing a PostgreSQL server is as easy as running a few commands; a quick search yields many examples.
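From PHP, the only thing that changes between those layouts is the connection string. A minimal sketch, where the remote hostname and credentials are placeholders:

<?php
// PostgreSQL on the web server itself: localhost, as in the question.
$local = pg_connect("host=localhost port=5432 dbname=mydbname");

// PostgreSQL on another machine: only host and credentials change.
// db.example.com, gameuser and secret are placeholders.
$remote = pg_connect("host=db.example.com port=5432 user=gameuser password=secret dbname=mydbname");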
How do we make our code go live on the internet for free? (It's just a temporary website and will only be up for a few weeks at most)
I don't know anyone who is in the habit of offering free web hosting or DBMS services. You could ask a friend, or put an ad on Craigslist or something. Or, if you are tech-savvy (it doesn't sound like you are), you could configure a high-end router at your home to use dynamic DNS to point some domain at a machine running in your house.
How can we all access a shared PSQL database?
I have no experience with Heroku, but you might sniff around there. PostgreSQL's website also maintains a list of hosting companies. Amazon offers RDS instances running PostgreSQL. DigitalOcean has a variety of tutorials and how-tos on dealing with Postgres; you could probably fire up a 'droplet' server for super cheap and install it yourself without too much effort.
Amazon offers a free-tier database solution for Postgres: something like 300 hours (don't quote me on it) of a low-level setup.
They have tutorials on doing this here:
https://aws.amazon.com/rds/?nc2=h_m1
Once it's set up, you get the endpoint, and your connection string becomes something like:
$dbconn = pg_connect("host=[URLENDPOINT] port=5432 user=postgres dbname=postgres");
Let me start off by saying that I know this is not the preferred way to run Python, but I have had this website for several years and am looking to add functionality. If I try to move the site to a new host and server setup, I am afraid I will mess everything up.
I am using a GoDaddy shared server for my website, and I access it using cPanel. The website is a WordPress blog but also has a few tools I built using PHP and a SQL database to store the output. I want to create a chatbot using Python, but from what I understand, I can't use Django on a shared GoDaddy server.
Is there a way for me to run Python scripts given my limitations?
Is the best alternative for me to start a second server and build an API that processes the conversation and sends it back to my current website?
Shared hosting solutions tend to limit the software that can run on them. The last time I used GoDaddy, they had only a PHP stack, so probably no, you won't be able to use Python there.
But that's fine, you shouldn't!
If you plan on using Python, I recommend getting a VPS or switching to a cloud service like OpenShift.
You can find cheap and reliable VPS servers nowadays, so go for it.
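That also fits your second-server idea: host the chatbot as a small HTTP API on the VPS and call it from the existing PHP site. A rough sketch, where the endpoint URL and the JSON shape ({"message": ...} -> {"reply": ...}) are assumptions:

<?php
// Hypothetical glue code on the existing GoDaddy site: forward a chat
// message to a chatbot API running on another server.
function askChatbot($message)
{
    $ch = curl_init('https://chatbot.example.com/api/chat');
    curl_setopt_array($ch, array(
        CURLOPT_POST           => true,
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HTTPHEADER     => array('Content-Type: application/json'),
        CURLOPT_POSTFIELDS     => json_encode(array('message' => $message)),
    ));
    $response = curl_exec($ch);
    curl_close($ch);
    $data = json_decode($response, true);
    return isset($data['reply']) ? $data['reply'] : null;
}

echo askChatbot('Hello!');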
So I just watched a little tutorial on WebSockets and it makes sense, but for it to work, the WebSocket server script has to be running at all times; then users connect and messages are delivered to each other. However, I am confused about how this would be done on a website hosted with a hosting company like Bluehost. As far as I know, you can't have a script running continuously on Bluehost, so how would this be accomplished? Or, for something like a chat where messages are saved into a database, would it be better to use something like long polling instead? Thanks!
Your observation is correct: the server must be running continuously to support live WebSocket connections.
As such, you have to select a hosting company that allows and supports that particular configuration. Many of the cheapest shared hosting plans do not support it, because their economics depend on your code only running briefly per request, not continuously.
Here are some other answers on the topic:
PHP Websocket on Bluehost
php script Bluehost Websocket server
I don't know specifically about Bluehost, but some other similar companies require you to have a VPS (virtual private server) before you can run the continuous server process needed to support long-lived WebSocket connections.
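As for the long-polling alternative mentioned in the question: it works even on ordinary shared hosting, because each request is still a normal, finite PHP request. A rough sketch, where the messages table and its columns are invented:

<?php
// Rough long-polling endpoint: holds the request open (up to ~25s)
// until a new chat message appears, then returns it as JSON.
$pdo   = new PDO('mysql:host=localhost;dbname=chat', 'user', 'pass');
$since = isset($_GET['since']) ? (int)$_GET['since'] : 0;

$deadline = time() + 25;          // stay under typical 30s script limits
do {
    $stmt = $pdo->prepare('SELECT id, body FROM messages WHERE id > ? ORDER BY id');
    $stmt->execute(array($since));
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    if ($rows) {
        header('Content-Type: application/json');
        echo json_encode($rows);  // client re-polls with the last id seen
        exit;
    }
    usleep(500000);               // wait half a second, then re-check
} while (time() < $deadline);

header('Content-Type: application/json');
echo json_encode(array());        // timed out with no news; client re-polls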
I am starting my Azure project: I created a website to run WordPress, migrating from a previous host.
I use MySQL, and the ClearDB offering Azure has is very limited and expensive: $75 per month just because I need views and triggers (only available on the dedicated server plan), which is sad. I then created my MySQL database in Amazon RDS.
I did some latency tests, and having MySQL closer to the website's region helps a lot to reduce latency.
But still, my WordPress navigation seems slow! The first request is terrible, around 4 seconds; the next ones are a bit better, but nothing compared to the roughly 200 ms I had on my previous host!
Is this because MySQL is accessed remotely and is not in the same data center? Or is it because of something else, like Azure's "hot and cold" websites concept? (Even subsequent calls are slow.)
I am starting to realize Azure is not good for websites with PHP + MySQL.
I like the look and feel of the Azure UI and the Websites concept, but it is very disappointing that Azure doesn't have a native MySQL offering.
Latency is the problem. Before concluding whether Azure is good or not, try one more thing: create an Azure virtual machine and install MySQL on it, then edit your connection string to point to your VM. Also, install a caching plugin, because WordPress executes a lot of queries against your database.
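The change on the WordPress side is small. A sketch of the relevant wp-config.php lines (the VM hostname is a placeholder, and WP_CACHE is the constant most caching plugins set):

<?php
// Hypothetical wp-config.php excerpt for this setup.
define('DB_HOST', 'mysqlvm.cloudapp.net');  // your Azure VM running MySQL
define('WP_CACHE', true);                   // enables the caching plugin's drop-in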
Another option: did you try the Scalable WordPress offering available through the Azure Marketplace?
http://azure.microsoft.com/en-us/marketplace/partners/wordpress/scalablewordpress/
@Miguel, I agree with Thiago.
I think you can install a MySQL database on a VM and connect it to your WordPress website.
In this scenario, I suggest using a tool such as WebPageTest to find the items that take the most time; that is useful for deciding further steps.
At the same time, you can try to optimize your database, compress images, and so on. I recommend referring to this blog post about how to improve WP site performance.
By the way, Azure will unload your site if it is idle for the standard 20-minute timeout, which can cause slow responses for the first users after it is unloaded. You could enable the "Always On" feature if you think it is necessary.
Did you try using persistent connections from your WordPress site to the MySQL database? That can decrease latency by reusing existing connections instead of creating a new one on every request.
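For reference, with PHP's mysqli driver you request a persistent connection by prefixing the host with "p:". Whether WordPress passes this through depends on your wpdb/PHP version, so treat this as a sketch; hostname and credentials are placeholders:

<?php
// The "p:" prefix asks mysqli for a persistent connection, reusing
// the TCP connection to the remote MySQL server across requests.
$link = mysqli_connect('p:mydb.example.com', 'wp_user', 'secret', 'wordpress');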
The MySQL latency is an issue, and there is probably not much we can do about it, since the DB call has to travel further than localhost - simple as that.
1) If you need Azure for your MySQL-driven app, use a Microsoft VM, which runs quite reliably and lets you install PHP/MySQL on the same machine - speedy but relatively pricey. So far it is the only production-acceptable solution for MySQL I have found.
2) Don't use Azure if you need fast MySQL connections. We have tons of MS apps running in Azure with various versions of MSSQL, and they are all running relatively fine, but so far I have not found a fast solution for MySQL. Therefore, I am still using Linode for PHP/MySQL apps. Their speed and service are superb (I am not getting anything from Linode for this recommendation :) and it is quite inexpensive in comparison to Azure in this case.
I'm currently developing a PHP application that is going to use WebSockets for client-server communication. I've heard numerous times that PHP shouldn't be used for server applications because of its lack of threading mechanisms, its memory management (cyclic references), and its unhandy socket library.
So far, everything is working quite well. I'm using phpws as the WebSocket library and the Doctrine DBAL to access different database systems; PHP is version 5.3.8. The server should serve a maximum of 30 clients. Yet, especially in the last few days, I've read several articles stating that PHP is ineffective for long-running applications.
Now I'm not sure whether I should continue using WebSockets with PHP or rebuild the entire server-side application. I've tried Python with Socket.IO, though I did not get the results I expected.
I guess I have the following options:
Keep everything as it is.
Make the application use Ajax in combination with Socket.IO - e.g. run a server-side script that handles the clients' Ajax calls when data is submitted to the server.
The last point sounds quite interesting, though it would require some work. Would it be a problem for the server to execute all the clients' requests at once?
What would you recommend? Is the problem with PHP's memory management (I'm calling gc_collect_cycles() each time a client sends data to the server) still valid? Are there other reasons, besides the obvious ones (no threading, ...), for not using PHP as a server?
You can try running Socket.IO on a Node server on another port of your server (that is, if you are not using a hosting plan like GoDaddy's).
I am using it and the performance is really satisfying.
I have an Apache server on port 80 serving my PHP files, and my server-client communication is done by a Node.js server running Socket.IO on port 8080 (dev) or 843 (prod).
Node.js is really light and has great performance, but you need to run it as a server. Nodejitsu.com is a hosting solution that has the WebSocket protocol available and is in beta, so it is still free for now. Just note that you need to listen on port 80 with Socket.IO there; this is a limitation of their network.
If you want all your pages to be accessed on port 80, then you will need a reverse proxy like Varnish.
I hope that helps! Have a nice day.
Are there other reasons, besides the obvious ones (no threading, ...), for not using PHP as a server?
Yep, lots of the socket functions are incompatible with each other, and it's hell to debug.
I tried something similar myself and quit, frustrated, since every function I thought would make sense didn't do what I expected.