I have two websites on a hosted server with HostGator.
I have website1, which is a site I built to add products and images and to track data from all the websites we run. On this site I store the images in the part of the server's filesystem dedicated to it.
Then I have website2, which is the site for the web store we operate. It connects to the database that stores the data from website1 and grabs the relative paths to the images on website1's filesystem. However, I cannot get the browser to find these images and display them; it keeps giving me "image not found" errors.
I'm new to filesystem management with PHP, and it's proving to be a little difficult.
I am using this to set the file path to the image on website2:
src='".$_SERVER['DOCUMENT_ROOT']."/website1/public_html/$productIMG'
Is there something else I have to do to get it to grab and display the image, or is what I'm trying to do even possible? Maybe I am misusing $_SERVER['DOCUMENT_ROOT']. Can someone please help me?
The variable part of the path is correct, but the beginning of the path does not lead to the other website.
Here is an example of the link I am getting to an image.
A website cannot take the document root of another website and use it as its source. website1 obviously has an address, e.g. www.website1.com. That public address equates to its document root (the public_html directory). To fetch anything from there, you need to request www.website1.com/somefolder/somefile.extension from website2 or anywhere else. That won't let you down.
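A minimal sketch of that fix (the domain is a placeholder, and $productIMG is assumed to hold a path relative to website1's document root):

<?php
// $_SERVER['DOCUMENT_ROOT'] on website2 points at website2's own files,
// so the browser can never reach website1's images through it.
// Build an absolute URL on website1 instead.
$productIMG = 'images/products/example.jpg';            // hypothetical value from the database
$imageURL   = 'http://www.website1.com/' . $productIMG; // placeholder domain

echo "<img src='" . htmlspecialchars($imageURL, ENT_QUOTES) . "' alt='product'>";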
Related
I'd like to create a restricted admin area on my website where I can upload pictures and save them into one of my site's directories.
I don't want the uploaded pictures to be accessible directly via directory listing or the full path, e.g. www.mywebsite.at/uploads/test.jpg.
I read about .htaccess, and it now works in that I can't access these files directly from the browser, but how can I ensure that I can still access them using PHP? After uploading a picture, I would like to use it at a specific time (e.g. daily at 7 pm) to upload it to other websites like 500px.com using their API.
Thanks,
David
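For reference, a minimal sketch of the access pattern described above: an .htaccess deny rule only blocks HTTP requests, while PHP reads the file straight from disk, which the rule does not affect (the path below is hypothetical).

<?php
// The "deny from all" in uploads/.htaccess stops browsers, not this script:
// PHP reads through the filesystem, not through HTTP.
$path = __DIR__ . '/uploads/test.jpg'; // hypothetical upload path

if (is_readable($path)) {
    $data = file_get_contents($path); // works despite the .htaccess block
    // ...pass $data to the 500px API call here, e.g. from a cron job at 7 pm
}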
I want to get a notification when a company's public website adds a document. I need to do this for around 400 public sites. As every site has a different document directory, I will build a database of all the sites' directory information in MySQL on my local server.
Example 1
http://www.hubpower.com
The documents directory is at the following path:
http://www.hubpower.com/wp-content/themes/hubco/pdf/
There are two document links in the above folder:
http://www.hubpower.com/wp-content/themes/hubco/pdf/3Q2K17%20Result.pdf
http://www.hubpower.com/wp-content/themes/hubco/pdf/1910-financial-results-announcements-(dec-2015).pdf
Example 2
http://www.pk.abbott/investor/investor-information.html
There are two document directories at the following paths:
http://dam.abbott.com/en-pk/documents/pdf/investors/
http://dam.abbott.com/en-pk/investor-relations/
Here are the document links:
http://dam.abbott.com/en-pk/documents/pdf/investors/Q12017.pdf
http://dam.abbott.com/en-pk/investor-relations/2016Q3.pdf
http://dam.abbott.com/en-pk/investor-relations/Abbott_A_R_2016.pdf
http://dam.abbott.com/en-pk/investor-relations/AR2015.pdf
If a website adds any more PDF documents under the above paths, I would like to get an email notification and also download the new documents from the website directory to my local server.
Please advise a solution to achieve this goal. I prefer working with
REST, PHP, AngularJS, Node.js, Python, or JavaScript.
Thanks & regards
You can use any website-content-change tool, like the Chrome plugin Visualping, which can email you.
From this answer:
In general, you will need to poll the website if there are no other possibilities like a news feed. You can't force them to provide such a service.
For Wikipedia in particular, there are live-update IRC streams, one for each project. Wikistream is such an app that reads the feed; you can view its open-source Node.js code on GitHub.
You can use Node.js and the request module to fetch the HTML for each of the 400 pages you mention, then parse it with cheerio, then use the mysql module to put the data into your database. You can also check whether the scraped data already exists in your database; if it does, do nothing. You could run this once a day.
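The answer above suggests Node.js; the same polling idea sketched in PHP looks roughly like this (the database credentials, table name, and notification address are hypothetical):

<?php
// Fetch one directory page, extract PDF links, and handle any link
// not yet stored in the database.
$pdo  = new PDO('mysql:host=localhost;dbname=monitor', 'user', 'pass');
$html = file_get_contents('http://www.hubpower.com/wp-content/themes/hubco/pdf/');

preg_match_all('/href="([^"]+\.pdf)"/i', $html, $matches);

foreach ($matches[1] as $link) {
    $stmt = $pdo->prepare('SELECT 1 FROM documents WHERE url = ?');
    $stmt->execute([$link]);
    if (!$stmt->fetchColumn()) { // new document
        $pdo->prepare('INSERT INTO documents (url) VALUES (?)')->execute([$link]);
        mail('me@example.com', 'New PDF found', $link);                  // notify
        // file_put_contents(basename($link), file_get_contents($link)); // download
    }
}

Run it from cron once a day, looping over all 400 stored directory URLs.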
I have a folder full of images on my server that my mobile app accesses.
www.mysite.com/images/image001.jpg
Whoever has this link can access the files. They can also work out that the images follow a certain order and thus guess the pattern, etc.
The image links are fetched via the PHP inside the app, which uses a token to verify that the user is legitimate and that the request is indeed coming from a phone that has the app installed.
What I want is to secure the folder from external access, prevent people from opening it in a browser and seeing everything, and limit access to the PHP file only.
I have used the .htaccess trick with "deny from all" so that it shows the Forbidden message whenever someone visits from the web; however, all my JSON requests now fail as well.
What can I do to accomplish this?
You will have to serve the images with a PHP script that also checks that access is permitted.
Once you've done this you can simply store the images outside the web root, which makes them inaccessible from the web, except through the PHP file that serves them.
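A minimal sketch of such a serving script (the storage path is a placeholder, and isValidToken() stands in for whatever verification the app's PHP already does):

<?php
// image.php?name=...&token=... serves a file stored outside the web root,
// but only after the access check passes.
function isValidToken(string $token): bool
{
    // Placeholder: substitute the app's real token verification here.
    return hash_equals('expected-token', $token);
}

$storage = '/home/user/private_images/';  // outside the document root
$name    = basename($_GET['name'] ?? ''); // basename() blocks path traversal
$path    = $storage . $name;

if (!isValidToken($_GET['token'] ?? '') || !is_file($path)) {
    http_response_code(403);
    exit;
}

header('Content-Type: image/jpeg');
header('Content-Length: ' . filesize($path));
readfile($path);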
The best option is to make the picture links randomized and unguessable.
A picture link would then look like this:
www.mysite.com/images/8Md9FhD1hANdIBUz4WVCzKR227fykTByq6SKHas5FyYJDr2EjAlIn1bS0f5gPJih.jpg
YouTube uses this method for "private" videos.
Users and bots can't access files by enumerating them, and you can't guess the next picture's name.
When the user is authenticated, display the link. The worst thing that can happen is that the user shares it (they can download and share it no matter what you do).
When you save a picture on your server, just randomize its name.
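A minimal sketch of that renaming step (assuming a standard upload form field named "photo" and PHP 7+ for random_bytes()):

<?php
// Store the upload under a long random, unguessable name.
$ext  = strtolower(pathinfo($_FILES['photo']['name'], PATHINFO_EXTENSION));
$name = bin2hex(random_bytes(32)) . '.' . $ext; // 64 random hex characters

move_uploaded_file($_FILES['photo']['tmp_name'], __DIR__ . '/images/' . $name);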
I have a number of different domains where I would like users to be able to select a photo; however, I want the photos uploaded to and stored on one central, separate domain.
Is there a more advisable way to do this?
I've considered using an iframe from the other domain (so they are interacting with the required domain) but haven't tried it yet.
I have also read that curl can potentially do this.
Any other ideas/problems/concerns...
All advice appreciated.
thx
There are a couple of ways you can handle this. Here are the scenarios I see.
Scenario A:
All domains are on the same server. With this setup you can actually help your server security by storing the images and other files in a directory that is not accessible by Apache. Then you use PHP to serve the file to the end user through virtual links. Drupal does this with its private file system.
If the users are able to upload images from the other domains then just have all of the domains write the images to the same directory, but then always return the "central" domain's URL for retrieving them.
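A minimal sketch of that shared-directory setup (the directory path and central domain are placeholders):

<?php
// Upload handler used by any of the domains: every domain writes to the
// same directory, but the URL returned always points at the central domain.
$shared = '/home/user/shared_uploads/';       // shared by all domains on the server
$name   = bin2hex(random_bytes(16)) . '.jpg'; // avoid filename collisions

move_uploaded_file($_FILES['photo']['tmp_name'], $shared . $name);

echo 'http://central-domain.example/uploads/' . $name; // placeholder central domain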
Scenario B:
The domains exist across two or more servers. With this setup what you really need to do is setup an API where the other domains communicate behind the scenes to send the photo to the core repository domain. This domain saves it and returns a URL for accessing it.
In either case you should really look at CDN technology. It's basically what you're trying to accomplish, plus a lot more. I would also recommend that in either scenario you always use the central domain's path for all the images instead of returning them through the current domain.
No reason to get iframes or curl involved. Assuming all these domains are on the same server (or the same cluster of servers), nothing requires that a file uploaded through a form on one domain be served only from that domain. An upload form can do whatever it wants with the uploaded file, including storing it in a location where it'll be available from your central domain.
I am launching a web application soon that will be serving a fair number of images, so I'd like to have a main web server and a static content server, and possibly a separate database server later on.
I'd like the user to:
login and be able to upload a photo
the photo is renamed to a random string
the photo is processed into a thumbnail (see the sketch after this list)
the photo and thumbnail are stored into a filesystem on the static server.
the photo and thumbnail's directory and filename are stored in a mysql database
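A minimal sketch of the rename and thumbnail steps using PHP's GD extension (form field and directory names are hypothetical):

<?php
// Rename the upload to a random string and produce a thumbnail with GD.
$name  = bin2hex(random_bytes(16));
$photo = __DIR__ . '/photos/' . $name . '.jpg';
$thumb = __DIR__ . '/thumbs/' . $name . '.jpg';

move_uploaded_file($_FILES['photo']['tmp_name'], $photo);

$src = imagecreatefromjpeg($photo);
$w   = imagesx($src);
$h   = imagesy($src);
$tw  = 150;                        // thumbnail width
$th  = (int) round($h * $tw / $w); // keep the aspect ratio

$dst = imagecreatetruecolor($tw, $th);
imagecopyresampled($dst, $src, 0, 0, 0, 0, $tw, $th, $w, $h);
imagejpeg($dst, $thumb, 85);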
The problem is I don't know how to have the user instantly upload an image to a separate server.
I thought about using Amazon S3, but you can't edit filenames before posting them (through POST; I'd rather not use the REST API).
I could also use PHP's FTP functions to upload to a separate server, but I'd like to dynamically create folders based on the properties of the image (so I don't have all the images in one big folder, obviously), and I don't know how that would work over FTP...
Or I could save them locally and use a CDN. I'm not too familiar with CDNs, so I don't know whether using them this way would be appropriate or cost-effective.
What are my options here? I'd like the images to be available instantly (no cron jobs/queues)
Thanks.
You can create directories over FTP with PHP, so that should not be a showstopper.
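A minimal sketch with PHP's built-in FTP functions (host, credentials, and the directory scheme are placeholders):

<?php
// Create a nested directory based on image properties, then upload into it.
$conn = ftp_connect('static.example.com'); // placeholder host
ftp_login($conn, 'user', 'pass');
ftp_pasv($conn, true);

$dir = 'images/' . date('Y/m');   // e.g. group images by upload month
foreach (explode('/', $dir) as $part) {
    @ftp_mkdir($conn, $part);     // ignore "already exists" errors
    ftp_chdir($conn, $part);
}

ftp_put($conn, 'photo.jpg', '/tmp/photo.jpg', FTP_BINARY);
ftp_close($conn);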
I thought about using Amazon S3, but you can't edit filenames before posting them (through POST; I'd rather not use the REST API).
If you let your PHP server do the uploading to S3 via POST, you can name the files whatever you want. You should do that anyway; letting your users upload to S3 directly, without your PHP code in between, sounds bad for security to me.
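A minimal sketch of that server-side upload with the official AWS SDK for PHP (bucket name, region, and key scheme are placeholders; credentials are assumed to come from the environment):

<?php
require 'vendor/autoload.php'; // aws/aws-sdk-php installed via Composer

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1', // placeholder region
]);

// Because the server performs the upload, you pick the key (filename) yourself.
$key = date('Y/m/') . bin2hex(random_bytes(16)) . '.jpg';

$s3->putObject([
    'Bucket'     => 'my-image-bucket', // placeholder bucket
    'Key'        => $key,
    'SourceFile' => $_FILES['photo']['tmp_name'],
]);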