Get URL? Read URL and use it in a PHP file [closed] - php

OK, I know the title is very confusing, but I do not know how to describe this. Some tips? :D
I want to create a page called, let's say -> category.php
On this page you see the parent categories. If a user clicks on a parent I need him to go to category.php/parent_category.
On this page he or she sees the children of that specific parent, and so on until the final page (in my case a product) is found.
I know this might sound simple, or maybe a dumb question, but this is something I have never done before, and after searching the internet and my Lynda courses I am still empty-handed.
Hope someone can help me or lead the way a little bit.

As per the OP's wish, comment converted to an answer (slightly modified):
"On this page you see the parent categories. If a user clicks on this parent I need him go to category.php/parent_category."
You're looking for URL rewriting:
http://www.sitepoint.com/guide-url-rewriting/
...and click on the links at the bottom of the page for the continuation.
I.e.: Go to page: 1 | 2 | 3 | 4
Example pulled from the said link:
If your application uses links like:
http://www.downloadsite.com?category=34769845698752354
then most of your visitors will find it difficult to get back to their favourite category (eg. Nettools/Messengers) every time they start from the main page of your site. Instead, they’d like to see URLs like this:
http://www.downloadsite.com/Nettools/Messengers
You can also consult the manual and the extensive guide on Apache.org:
http://httpd.apache.org/docs/2.0/misc/rewriteguide.html
The rewriting itself is done through .htaccess.
Plus, if I read the question wrong, you may instead be looking for "pagination" with search capabilities.
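Roughly, the kind of rewrite the question asks for could look like the sketch below. This is my own illustration, not taken from the linked guides; the slug parameter, the categories table and the database credentials are assumptions, and category.php stands for the page from the question.

# .htaccess (sketch) - send category.php/anything to category.php?slug=anything
RewriteEngine On
RewriteRule ^category\.php/([^/]+)/?$ category.php?slug=$1 [L,QSA]

<?php
// category.php (sketch) - assumes a hypothetical categories(id, parent_id, slug, name) table
$pdo  = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass'); // adjust credentials
$slug = isset($_GET['slug']) ? $_GET['slug'] : null;  // null on the top-level page

if ($slug === null) {
    // top level: show the root categories
    $stmt = $pdo->query("SELECT slug, name FROM categories WHERE parent_id IS NULL");
} else {
    // child level: show the children of the clicked category
    $stmt = $pdo->prepare(
        "SELECT c.slug, c.name
           FROM categories c
           JOIN categories p ON p.id = c.parent_id
          WHERE p.slug = ?"
    );
    $stmt->execute([$slug]);
}

foreach ($stmt as $row) {
    // each link drills one level deeper, until you reach a product page
    $href = 'category.php/' . rawurlencode($row['slug']);
    echo '<a href="' . htmlspecialchars($href) . '">' . htmlspecialchars($row['name']) . '</a><br>';
}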

I think what you usually do, if you want to make links that the user can save and use later without having to make a specific page for each of them, is to use GET data with $_GET[].
That way you'll get links like "category.php?pc=this_category".
But if you specifically want links like category.php/parent_category you will need to rewrite them with .htaccess.
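A tiny sketch of that plain query-string approach (the pc parameter name comes from the answer; the sample data is made up):

<?php
// category.php (sketch) - plain query-string variant, no .htaccess needed.
// $categories stands in for whatever your database query returned.
$categories = [
    ['slug' => 'books',  'name' => 'Books'],
    ['slug' => 'movies', 'name' => 'Movies'],
];

// the currently selected parent, if any; use it to decide which children to show
$pc = isset($_GET['pc']) ? $_GET['pc'] : null;

// build the saveable links described above
foreach ($categories as $cat) {
    echo '<a href="category.php?pc=' . urlencode($cat['slug']) . '">'
       . htmlspecialchars($cat['name']) . '</a><br>';
}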

Related

PHP - How to dynamically add a virtual route [closed]

Hi everybody, and thanks in advance for your help!
I'm building a PHP website without using any framework. The site will be multilingual and the client wants to add a virtual path for each country, for example: www.mysite.com/ar/ for Argentina, www.mysite.com/cl/ for Chile, and so on. Some countries will share the same translation, but each one has to have its own virtual path. The country will be selected with a select element in HTML, and it has to change the route dynamically.
I have looked at other solutions, but they all talk about rewrite rules in .htaccess, which is a static solution; I need something dynamic.
I hope I have explained myself and that you can help me.
Thanks!!
/EDIT 06-02-17/
More details as suggested:
I have a table in MySQL with the countries. Each record has a code, the name of the country and a virtualPath (just a string, for example: Code > ARG, Name > Argentina, virtualPath > AR).
In the header of the page I'm loading a country selector. The user selects his country (for example, Spain) and I want to reload the page, changing the URL to match the virtualPath of that country, for example www.mywebsite.com/ES/ or www.mywebsite.com/AR/ and so on.
Hope the detail clears things up! Thanks!!
Sorry for the short answer, I'm short on time, but I hope I can point you in the right direction.
Look into REST APIs:
write yourself a PHP script that evaluates the URL and then loads the appropriate data according to it, getting the data or the path to the data from a database which is updated by the user/client.
Good luck :)
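A minimal front-controller sketch of that idea (my own illustration; the countries table with its virtualPath column comes from the question's edit, while index.php, the database credentials and the fallback redirect are assumptions). Note that you still need one static catch-all rule; the per-country logic itself stays dynamic in PHP:

# .htaccess (sketch) - send everything that is not a real file to index.php
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^ index.php [L]

<?php
// index.php (sketch) - evaluate the URL and pick the country dynamically.
$pdo = new PDO('mysql:host=localhost;dbname=mysite', 'user', 'pass'); // adjust credentials

$path        = trim(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH), '/');
$segments    = $path === '' ? [] : explode('/', $path);
$virtualPath = strtoupper(isset($segments[0]) ? $segments[0] : '');

$stmt = $pdo->prepare("SELECT code, name FROM countries WHERE virtualPath = ?");
$stmt->execute([$virtualPath]);
$country = $stmt->fetch(PDO::FETCH_ASSOC);

if ($country === false) {
    // unknown or missing prefix: fall back to a default country (assumption)
    header('Location: /AR/');
    exit;
}

// load the translation for $country['code'] and render the page...
echo 'Serving the site for ' . htmlspecialchars($country['name']);

// The country <select> in the header only needs to point the browser at the
// chosen prefix, e.g. onchange="location.href = '/' + this.value + '/';"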

How to prevent someone from fetching my database? [closed]

I have a video website and I worry that somebody could write a script that fetches my whole database and uses it, because I use the video id in the address as a query string to tell each page which video to show.
Example:
http://example.com/video/215/videotitle
215 is my video id and videotitle is the title of my video. I want to have something like YouTube:
www.youtube.com/watch?v=__zj6ibrq04
How can I do this?
I should mention that I used mod_rewrite to get an address like this, so I really do worry about somebody fetching my database, because they can see the video ids.
That's awful, because the id is an auto-increment primary key in my database!!!
Is there any suggestion?
If all your pages are public, i.e. anyone can go to any site at any time, there's only one thing you can do against a bot automatically scraping your site: detect scraping behaviour and throttle it. That means you need to track every visitor by IP address and/or other characteristics and start denying them access once they start requesting too many pages within a certain time window. No, this is not trivial to get right.
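A deliberately naive sketch of that kind of throttling (my own illustration, not a production recipe; it assumes the APCu extension is installed, and a real setup would more likely use a reverse proxy, fail2ban, or a shared store such as Redis):

<?php
// throttle.php (sketch) - include at the top of every public page.
// Counts requests per IP in fixed time windows using APCu.
$ip     = $_SERVER['REMOTE_ADDR'];
$window = 60;    // seconds per window
$limit  = 120;   // max requests per window (tune for your real traffic)

$key = 'hits:' . $ip . ':' . (int) floor(time() / $window);

if (apcu_add($key, 1, $window + 1)) {
    $hits = 1;                  // first request from this IP in this window
} else {
    $hits = apcu_inc($key);     // subsequent requests
}

if ($hits !== false && $hits > $limit) {
    http_response_code(429);    // Too Many Requests
    header('Retry-After: ' . $window);
    exit('Slow down.');
}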
In this case it doesn't really matter what your URLs look like; you may think that YouTube's URLs are "unguessable", but (most of) YouTube's videos are entirely public and can be discovered by browsing through YouTube's front page. There's no need to guess the URL; that's an irrelevant detail. And even if they were not, you could simply start trying every single URL from __aaaaaaaa through __9999999. It'll take a while, but it's possible.
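That said, if you still want YouTube-style opaque ids purely so that the sequential auto-increment id is not exposed (it will not stop a determined scraper, for the reasons above), a sketch for generating and storing a random public id could look like this; the public_id column, the videos table and the credentials are assumptions:

<?php
// Generate an 11-character, URL-safe public id for a new video (sketch).
// Store it in a UNIQUE column next to the auto-increment primary key and
// look videos up by public_id instead of id in public URLs.
function makePublicId()
{
    // 8 random bytes -> base64url, trimmed to 11 characters
    $raw = random_bytes(8);
    return substr(rtrim(strtr(base64_encode($raw), '+/', '-_'), '='), 0, 11);
}

// Usage, assuming a hypothetical videos(id, public_id, title) table:
$pdo = new PDO('mysql:host=localhost;dbname=videos', 'user', 'pass'); // adjust credentials
$publicId = makePublicId();
$stmt = $pdo->prepare("INSERT INTO videos (public_id, title) VALUES (?, ?)");
$stmt->execute([$publicId, 'My video title']);
echo 'http://example.com/watch?v=' . $publicId;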

Automatically using someone's online search database [closed]

Does anyone have any ideas about the best way to automatically use someone's online search database, given a static search (see example)? It might also make this question more useful to add a solution for a non-static search.
So for example, I have a website and I want to create a link to the PDF file of the latest report by a certain person on this site: http://aris.empr.gov.bc.ca The search criteria do not change; all that changes is new results as the database is updated, so the search result is always http://aris.empr.gov.bc.ca/search.asp?mode=find Notice that not all entries have a report yet.
So far my idea is to use a PHP script to search through the source code of the completed search result page, look for the first instance of a .pdf string, and then extract the whole link (the page is ordered by date, so the first PDF file found would be the latest report that has a PDF file available).
The problems with this solution:
1) it is very specific to my problem and only works for a static search result, and so is not a good Q&A
2) I am not sure if the completed search link re-runs the search every time you follow it, or if it leads to an old result that could become out of date
3) my solution is not sexy and is held together by duct tape, if you know what I mean.
Thanks,
-Adrian
In real terms you want to scrape the page(s).
You have 2 options in PHP:
1. Use cURL to fetch the page and use PHP's DOM parser to parse and extract the content from it.
2. Use the PHP Simple HTML DOM library, see here: http://simplehtmldom.sourceforge.net
It has ready-made functions and you won't need to use cURL much here.
I hope you get the idea.
Try some code, show it here, and we will guide you more on this...
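A sketch of option 1 (cURL plus the built-in DOM parser). The search URL comes from the question; everything else is my own illustration, and the markup of the real results page may well differ, so treat the XPath expression as an assumption:

<?php
// Fetch the results page with cURL (sketch).
$ch = curl_init('http://aris.empr.gov.bc.ca/search.asp?mode=find');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$html = curl_exec($ch);
curl_close($ch);

if ($html === false) {
    die('Could not fetch the search results.');
}

// Parse it and grab the first link whose href contains ".pdf".
$doc = new DOMDocument();
@$doc->loadHTML($html);   // suppress warnings caused by messy real-world markup
$xpath = new DOMXPath($doc);
$node  = $xpath->query('//a[contains(translate(@href, "PDF", "pdf"), ".pdf")]')->item(0);

if ($node !== null) {
    // the page is ordered by date, so this should be the latest available report;
    // make the link absolute if the page uses relative hrefs
    echo 'Latest report: ' . $node->getAttribute('href');
} else {
    echo 'No PDF link found.';
}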

Website using just index.php and database [closed]

I'm working on creating an image gallery with just an index.php.
Users can switch between images by using "next" or "previous" buttons.
Is this healthy for the server?
Does this method harm SEO, since no HTML pages really exist except for a blank template? (All the SEO-related data is in the MySQL DB.)
Thanks in advance.
The practice of having a single-page website will not by default harm your site's page rank, although it does add some issues.
Only one URL - usually sites have /about, /contact, /gallery, /we'r'awesome, etc. You get one. (You could use hash tags as a practical workaround, but this won't do much for SEO; if you designate each hash tag as unique content and it all points to one page of the same content, this can actually hurt you.)
Best SEO practice is to have one set of content/keywords per page. With a one-page site, you only get one go at it.
Additionally, a single page may take longer to load (depending on how you set it up) and be less intuitive to navigate (again, depending on how it's set up).
Check out this article for some more points.
Yes, it can harm SEO, but a lot of things can, so that should be the last thing you focus on, since Google is all that matters now and it changes often.
You can fix that by adding "tutorial" and "about" pages and whatnot anyway.
Whether it's healthy for the server depends on your code, the server, the number of visitors, how you store the images, and so on.
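Not part of either answer, but a common middle ground is to keep the single index.php and still give every image its own crawlable URL and title via the query string. Everything in this sketch (the images table, its columns, the image parameter, the credentials) is an assumption:

<?php
// index.php (sketch) - one file, but one URL and one <title> per image.
$pdo = new PDO('mysql:host=localhost;dbname=gallery', 'user', 'pass'); // adjust credentials
$id  = isset($_GET['image']) ? (int) $_GET['image'] : 1;

$stmt = $pdo->prepare("SELECT id, title, path FROM images WHERE id = ?");
$stmt->execute([$id]);
$image = $stmt->fetch(PDO::FETCH_ASSOC);
if ($image === false) {
    http_response_code(404);
    exit('No such image.');
}
?>
<!DOCTYPE html>
<html>
<head><title><?= htmlspecialchars($image['title']) ?></title></head>
<body>
  <img src="<?= htmlspecialchars($image['path']) ?>" alt="<?= htmlspecialchars($image['title']) ?>">
  <a href="index.php?image=<?= $image['id'] - 1 ?>">previous</a>
  <a href="index.php?image=<?= $image['id'] + 1 ?>">next</a>
</body>
</html>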

Adding HTML search terms to results page URL [closed]

Currently I have a search box on site.com/search.php. A user enters in a search term and is taken to site.com/index.php which displays the results for the search terms. Except, no matter what a user enters, the results page URL will always just be site.com/index.php - instead, I want it to read site.com/search=word1+word2 or something like that so that each page has a unique URL based on the search input.
I also want it to work in the reverse way, so that someone could, in theory, go straight to site.com/search=query and it would be the same thing as if they first went to the main search page, typed something in, and hit return.
I'm new to programming and recently started learning about the .htaccess file which I understand I'll need in order to do this. I am just at a total loss as to how to do what I want. I know I need some sort of URL re-writing but not much more than that.
The easiest approach here is to change your search form to use method="get" instead of method="post".
This will create a URL like:
site.com/index.php?search=word1+word2...
Simply change the search script to use $_GET or $_REQUEST instead of $_POST and you'll be all set without any other significant changes. It will also let users directly run searches via the URL.
The only downside of this approach is that it does not create a beautiful URL (i.e. it still shows "index.php"); however, you can easily work around this by using .htaccess to rewrite the URL.
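A sketch of both halves of that answer: the form switched to GET, plus an optional .htaccess rule for the /search=word1+word2 style URL the question mentions. The file names and the search parameter name are assumptions:

<!-- search.php (sketch) - the form now submits with GET -->
<form action="index.php" method="get">
  <input type="text" name="search">
  <button type="submit">Search</button>
</form>

<?php
// index.php (sketch) - read the query from the URL instead of from POST data
$query = isset($_GET['search']) ? trim($_GET['search']) : '';
if ($query !== '') {
    echo 'Results for: ' . htmlspecialchars($query);
    // ... run the actual search here ...
}

# .htaccess (sketch) - map site.com/search=word1+word2 to index.php?search=word1+word2
RewriteEngine On
RewriteRule ^search=(.+)$ index.php?search=$1 [L,QSA]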
