How to prevent someone from fetching my database? [closed] - php

I have a video website, and I'm worried that somebody could write a script that fetches my entire database and uses it, because I use the video ID in the address as a query string that tells each page which video to show.
Example:
http://example.com/video/215/videotitle
215 is my video ID and videotitle is the title of my video. I want something like YouTube has:
www.youtube.com/watch?v=__zj6ibrq04
How can I do this?
I should mention that I used mod_rewrite to get an address like this, so I'm seriously worried about somebody fetching my database, because they can see the video ID.
That's bad, because the id is an auto-increment primary key in my database!
Any suggestions?

If all your pages are public, i.e. anyone can go to any page at any time, there's only one thing you can do against a bot automatically scraping your site: detect scraping behaviour and throttle it. That means you need to track every visitor by IP address and/or other characteristics and start denying them access once they request too many pages within a certain time window. No, this is not trivial to get right.
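As a minimal sketch of such throttling, assuming the APCu extension is available; the 60-requests-per-minute threshold and keying on REMOTE_ADDR are illustrative choices, not recommendations:

```php
<?php
// Minimal rate-limiting sketch. Assumes the APCu extension is installed.
// Note: REMOTE_ADDR may be a proxy's address if you are behind one.
$ip   = $_SERVER['REMOTE_ADDR'];
$key  = 'hits_' . $ip . '_' . floor(time() / 60); // one counter per IP per minute
$hits = apcu_fetch($key) ?: 0;

if ($hits >= 60) {                  // illustrative threshold
    http_response_code(429);        // Too Many Requests
    exit('Rate limit exceeded.');
}

apcu_store($key, $hits + 1, 120);   // keep the counter a bit longer than the window
```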
In this case it doesn't really matter what your URLs look like; you may think that YouTube's URLs are "unguessable", but (most of) YouTube's videos are entirely public and can be discovered by browsing through YouTube's front page. There's no need to guess the URL; that's an irrelevant detail. And even if they were not, you could simply start trying every single URL from __aaaaaaaa through __9999999. It'll take a while, but it's possible.
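That said, if you still want YouTube-style opaque IDs (they hide your row count, but per the above they are obscurity, not security), here is a sketch of generating one in PHP 7+; storing it in a public_id column with a UNIQUE index is an illustrative choice:

```php
<?php
// Sketch: generate an 11-character, URL-safe random public ID.
// Length and alphabet are arbitrary; store it next to the auto-increment
// key in a UNIQUE-indexed column and retry on the rare collision.
function makePublicId(int $length = 11): string
{
    $raw = random_bytes($length);                               // PHP 7+
    $id  = rtrim(strtr(base64_encode($raw), '+/', '-_'), '=');  // base64url
    return substr($id, 0, $length);
}

echo makePublicId(); // prints a random ID such as "Ab3xY_9qPzW"
```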

Related

Hide dynamic page "movie.php?id=10" from my PHP website's URLs and replace spaces with hyphens [closed]

My website's URLs look like:
"/movie.php?id=10 Rangoon (2017)"
Spaces are converted to %20 by web browser.
But I want the simple URL like this:
"/Rangoon-(2017)"
How can I do that?
My website is developed in PHP; all of its data is stored in a MySQL database, and the "movie.php" page retrieves that data from MySQL.
Well, that's not a simple one; you'll have to make several updates:
generate a URL for each of your movie pages (by the way, don't use parentheses in a URL; in your example the URL should be something like rangoon-2017). You may use my seoUrl method; a sketch of the idea follows these steps
store this URL somewhere in your database
decide on a URL scheme, like movie/title-of-your-movie
when displaying a link, display the URL scheme
create an .htaccess file which rewrites movie/whatever to movie.php?url=whatever
on movie.php, retrieve your movie information from the URL, not from the ID
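The answer's actual seoUrl method isn't shown, so the following is only an approximation of the idea, together with the .htaccess rule and the lookup from the last two steps. It assumes PHP 7+, a PDO connection in $pdo, and a movies table with a url column; adjust names to your schema.

```php
<?php
// Sketch of a slug generator in the spirit of the seoUrl method mentioned above.
function seoUrl(string $title): string
{
    $slug = strtolower($title);
    $slug = preg_replace('/[^a-z0-9]+/', '-', $slug); // collapse everything else into hyphens
    return trim($slug, '-');
}

echo seoUrl('Rangoon (2017)'); // prints: rangoon-2017
```

The .htaccess rewrite could look like:

```
# Rewrite movie/<slug> to movie.php?url=<slug>
RewriteEngine On
RewriteRule ^movie/([a-z0-9-]+)$ movie.php?url=$1 [L,QSA]
```

And movie.php then looks the record up by slug instead of by ID:

```php
<?php
// movie.php sketch: retrieve the movie by its stored slug.
// Assumes a PDO connection in $pdo and a UNIQUE index on movies.url.
$stmt = $pdo->prepare('SELECT * FROM movies WHERE url = ? LIMIT 1');
$stmt->execute([$_GET['url'] ?? '']);
$movie = $stmt->fetch(PDO::FETCH_ASSOC);

if (!$movie) {
    http_response_code(404);
    exit('Movie not found.');
}
```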
Additional thoughts:
you'll have to make sure each movie has a unique URL. Adding the year is a good idea, since I doubt two movies will have the exact same title and year.
also, make sure your URLs are unique in the database (and indexed, since you'll be searching by them)
why the movie/ prefix on the URL? Simply because if you don't add it, you'll find it more difficult to route other types of pages to the correct file.

Checking the entire database when publishing a new post [closed]

I have a customized CMS built in PHP. My website has more than 200,000 posts. When I publish a new post, it takes far too long to publish a single post.
When I publish a new post, it checks the entire database to find out whether the URL already exists; if it does, it shows an error.
I want to know: is it necessary to check the entire database to see whether the URL already exists?
Because it checks the entire database before publishing, it takes a lot of time and the server goes down.
Instead of checking the database to see if the URL is unique, simply force it to be unique by appending the ID of the new record to the URL string you are saving, and configure your website to work that way. For example, instead of your URL looking like:
mywebsite.com/post/test-post
Use something like
mywebsite.com/post/test-post/100
Where 100 is the ID of the post, which is a unique primary key.
If you insist on using a URL that is generated from the subject line of a post (which I suppose is what you're doing here; your question is a bit vague), then create a table that contains hashes of every URL and search for the hash. With an index, that's a quick lookup. You'll need to watch out for hash collisions, so you should store the full string as well.
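A sketch of that hash-table idea with PDO and MySQL; the url_hashes table name and the assumption that PDO is configured to throw exceptions (PDO::ERRMODE_EXCEPTION) are illustrative:

```php
<?php
// Sketch: let a UNIQUE index on the hash column do the duplicate check,
// instead of scanning the whole posts table.
$url  = '/post/test-post';
$hash = sha1($url);

$stmt = $pdo->prepare('INSERT INTO url_hashes (hash, url) VALUES (?, ?)');
try {
    $stmt->execute([$hash, $url]);   // indexed lookup, no full-table scan
} catch (PDOException $e) {
    if ($e->errorInfo[1] == 1062) {  // MySQL duplicate-key error code
        exit('This URL already exists.');
    }
    throw $e;
}
```

Even without a separate hash table, the underlying fix is the same: put an index (ideally UNIQUE) on the column you check, so the lookup no longer touches every row.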

Filtering pornographic and forbidden content with PHP? [closed]

I've built a simple social network where users can register/log in and post four types of data:
Text, Pictures, Videos and Files.
Everything is stored in the database with the TEXT datatype. When they upload a pic/vid/file, the text is just the path to the related data on my server.
Now what I'm really interested in is how to filter adult/forbidden content, like videos or pictures showing pornographic or violent material. It would also be possible to upload movies split into smaller files, as my upload limit is 100 MB per file.
I guess YouTube filters movies by comparing them against an existing database, so they just have to compare chunks of data with the original file et voilà, it's filtered. That also means one can easily mirror a movie, and it will take YouTube much longer to catch that file.
However, how is pornographic/religious/violent content filtered?
I also have no idea what keywords I should search for; that alone would help me.
Use a report button, and if something is reported and is indeed considered "dirty", delete it and keep its hash in a database table. Then, when a file gets uploaded, check its hash against that table, so the same "dirty" content can't simply be re-uploaded.
To avoid legal trouble if your users stumble upon dirty content, make sure you clearly point out in your terms of service that your site hosts user-generated content and you can't guarantee that all content is safe, so that by using the website users agree to these terms. Get in touch with your lawyers, as the laws in each country may vary.
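A sketch of the hash-blocklist part, assuming a banned_hashes table (an illustrative name) and PDO; note that a plain file hash only catches byte-identical re-uploads, not re-encoded or cropped copies:

```php
<?php
// Sketch: reject uploads whose hash matches previously banned content.
// `banned_hashes` is an assumed table, populated when reports are upheld.
$hash = sha1_file($_FILES['upload']['tmp_name']);

$stmt = $pdo->prepare('SELECT 1 FROM banned_hashes WHERE hash = ? LIMIT 1');
$stmt->execute([$hash]);

if ($stmt->fetchColumn()) {
    http_response_code(403);
    exit('This file has been banned.');
}
// ...otherwise continue with the normal upload handling.
```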

Website using just index.php and database [closed]

I'm working on creating an image gallery with just index.php.
The users can switch between images by using "next" or "previous" buttons.
Is this healthy for the server?
Does this method harm SEO, since no HTML pages really exist except for a blank template? (All the SEO-related data is in the MySQL DB.)
Thanks in advance.
The practice of having a single-page website will not by default harm your site's page rank, although it does add some issues.
Only one URL. Sites usually have /about, /contact, /gallery, /we'r'awesome, etc.; you get one. (You could use hash fragments as a practical workaround, but this won't do much for SEO; if you designate each fragment as unique content and it all points to one page with the same content, it can actually hurt you.)
Best SEO practice is to have one set of content/keywords per page. With a one-page site, you only get one go at it.
Additionally, a single page may take longer to load (depending on how you set it up) and be less intuitive to navigate (again, depending on how it's set up).
Check out this article for some more points.
Yes, it can harm SEO, but a lot of things can, so that should be the last thing you focus on, since Google is all that matters now and it changes often.
You can fix that by adding "tutorial" and "about" pages and so on anyway.
Whether it's healthy for the server depends on your code, the server, the number of visitors, how you store the images, and so on. One simple way to at least give every image its own crawlable URL is sketched below.
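A sketch of that approach: still a single index.php, but each image is reachable at its own URL via a query string, so crawlers see distinct pages with distinct titles. The $pdo connection and the images table with id, title, and path columns are assumptions:

```php
<?php
// index.php sketch: one script, but each image has its own URL (?img=N).
// Assumes a PDO connection in $pdo and an `images` table (id, title, path).
$id   = max(1, (int) ($_GET['img'] ?? 1));
$stmt = $pdo->prepare('SELECT * FROM images WHERE id = ?');
$stmt->execute([$id]);
$image = $stmt->fetch(PDO::FETCH_ASSOC);

if (!$image) {
    http_response_code(404);
    exit('Image not found.');
}
?>
<title><?= htmlspecialchars($image['title']) ?></title>
<img src="<?= htmlspecialchars($image['path']) ?>" alt="<?= htmlspecialchars($image['title']) ?>">
<a href="?img=<?= $id - 1 ?>">previous</a>
<a href="?img=<?= $id + 1 ?>">next</a>
```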

How to make a bot navigate a site? [closed]

Given a product ID, associates have to navigate a vendor's website, log in, and perform a search in order to get details on a product for a customer.
My employers want a program that can take the product ID, navigate the vendor's website, and perform the search and so on to get the information, saving the associate from manually repeating this task every time a customer wants more information about a product.
I know many sites use methods (such as CAPTCHAs) to prevent exactly what I am trying to do, so I don't know whether that automatically makes my project an "evil" one. I certainly don't have evil intentions; my employers simply want to save associates time getting information they are going to get regardless. However, if this is "evil", please explain why, so I can explain to my employers why we should not go down this road. That being said...
How can I make something like this in PHP?
It depends on what site you are trying to access. Many sites have an API that can be used to access data. If that's not the case, you may need to write a program that loads the HTML using a GET request, parses the response, and retrieves the information you want. Without more details, that's the best answer I can give.
To start with, I'd recommend reading up on cURL and DOM:
cURL: http://php.net/curl (for fetching pages, even simulating search form)
DOM: http://www.php.net/manual/en/book.dom.php (to parse the fetched pages)
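A minimal sketch combining the two: fetch a search page with cURL and extract details with DOMXPath. The URL, query parameter, cookie-jar path, and class name are placeholders, not any real vendor's markup:

```php
<?php
// Sketch: fetch a vendor page and parse it. Everything vendor-specific
// below (URL, parameter, markup) is a placeholder.
$ch = curl_init('https://vendor.example.com/search?product_id=215');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEJAR, 'cookies.txt');  // persist the login session
curl_setopt($ch, CURLOPT_COOKIEFILE, 'cookies.txt');
$html = curl_exec($ch);
curl_close($ch);

$doc = new DOMDocument();
@$doc->loadHTML($html);              // suppress warnings from messy real-world HTML
$xpath = new DOMXPath($doc);

// Hypothetical markup: <div class="product-detail">...</div>
foreach ($xpath->query('//div[@class="product-detail"]') as $node) {
    echo trim($node->textContent), "\n";
}
```

If the site requires a login first, the same cURL handle can submit the login form (CURLOPT_POST and CURLOPT_POSTFIELDS) before running the search request.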
