Programming an e-book library in PHP

What would be the best way to code an e-book library in PHP? How should the e-books be displayed? Should the contents be generated from the database or from sources like TXT or PDF files? I'd prefer the e-books to be displayed in HTML format rather than PDF. Are there any good online tutorials on this?
I'd appreciate your thoughts.

I think you are running into problems because of your terminology, mainly "e-book". For example, what you want to do won't work on my Kindle, I expect. Do you want to do something similar to Google Books or Scribd?
Ideally you should store the content in the most flexible format you can, so that you retain the maximum amount of information; you can then dumb it down to whatever format you want to display it in.
So you could store it as XML or LaTeX, and then if you want to convert it to PDF, HTML or plain text you can transform it as needed.
As to how to display it, that depends largely on how you want people to interact with your application.
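To make the XML route concrete, here's a minimal sketch using PHP's XSLTProcessor to turn a stored chapter into HTML; the file names and stylesheet are placeholders for whatever master format and converter you settle on:

```php
<?php
// Minimal sketch: render a stored XML e-book chapter as HTML via XSLT.
// book.xml and book-to-html.xsl are hypothetical files; any master format
// (XML, DocBook, LaTeX) with its own converter would follow the same pattern.
$xml = new DOMDocument();
$xml->load('books/book.xml');

$xsl = new DOMDocument();
$xsl->load('stylesheets/book-to-html.xsl');

$processor = new XSLTProcessor();        // requires the php-xsl extension
$processor->importStylesheet($xsl);

header('Content-Type: text/html; charset=utf-8');
echo $processor->transformToXML($xml);   // emit the HTML view of the chapter
```

A separate script (or stylesheet) could produce the PDF or plain-text rendition from the same source file, which is the point of keeping one flexible master format.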

Related

Add a secured large document (Word, PDF, or EPUB format) to a PHP website

We are planning to add some of our publications to our website, Oshopindia.com.
We want to make these documents readable online only.
We do not want anyone to copy, cut, screenshot, or otherwise reproduce the publication in any manner. As the publication is copyrighted by the supplier, we cannot allow anyone to replicate or reproduce it. How can we add a large publication document online, in PDF or EPUB format or any other format, and make it secure? Our website is PHP based.
Once you put your data online, it's open to being copied. The only way to make sure people can't copy it is to not put it online. You can hinder them from copying it, but you can't stop them. Even large online publishers can't do this. Look at an Amazon book preview: they use images, which makes it harder to copy the text, but you can still copy the image.
Simply put, you cannot do this. If the content can be viewed, people can take screenshots. If you want it to be secure, don't publish it.
Your options:
Don't publish it.
Show an image so people cannot copy the text, and rely on a clear copyright notice.
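If you do go the image route, something along these lines with the Imagick extension (assuming Ghostscript support is available) renders a single PDF page as a JPEG; the file path and page parameter are placeholders:

```php
<?php
// Sketch of the "serve images instead of text" approach.
// Requires the Imagick extension with PDF (Ghostscript) support.
$page = isset($_GET['page']) ? (int) $_GET['page'] : 0;

$img = new Imagick();
$img->setResolution(150, 150);                         // set DPI before reading the PDF
$img->readImage('protected/publication.pdf[' . $page . ']');
$img->setImageFormat('jpeg');

header('Content-Type: image/jpeg');
echo $img->getImageBlob();                             // stream the rendered page
```

This only makes copying less convenient; as noted above, screenshots will always work.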

How to programmatically build an org-chart and export to a PowerPoint file?

Most of the question is in the title.
A client is asking me to build a form with dynamic hierarchical fields from which to build a sort of org-chart that must be exported to PPT (OpenOffice Impress is an option).
I've found many libraries that allow this, but all of them only export to images, HTML5, or other non-editable charts.
I'd really like to use Google Chart Tools (even though I'm wary of Google's tendency to discontinue its products), but this tool has no means of exporting to PowerPoint either.
PHPPowerPoint also seems like a possible solution, but there is no documentation as far as I can see, and I don't know whether the auto-arrange requirement can be met.
Since the chart is built dynamically, the library must also take care of automatically arranging all the items (unless anyone knows a way to do that programmatically).
Has anyone ever had this need, or does anyone know of a possible approach?
The main point is that the generated file must be editable.
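For reference, basic PHPPowerPoint usage looks roughly like the sketch below (pieced together from the library's bundled examples as far as I can tell; method names may vary between versions, and the positioning here is hard-coded, so any auto-arrange logic would still have to be written by hand):

```php
<?php
// Rough sketch: draw one org-chart box with PHPPowerPoint and save a .pptx.
// A real implementation would compute offsets/sizes from the hierarchy,
// which is the auto-arrange part the library does not do for you.
require_once 'PHPPowerPoint.php';
require_once 'PHPPowerPoint/IOFactory.php';

$ppt   = new PHPPowerPoint();
$slide = $ppt->getActiveSlide();

$box = $slide->createRichTextShape();
$box->setWidth(200);
$box->setHeight(60);
$box->setOffsetX(300);
$box->setOffsetY(40);
$box->createTextRun('CEO');

$writer = PHPPowerPoint_IOFactory::createWriter($ppt, 'PowerPoint2007');
$writer->save('org-chart.pptx');
```

The resulting shapes remain editable in PowerPoint or Impress, which addresses the "must be editable" requirement; the open question is still the automatic layout.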

Can I crawl websites, download specific pages, and save rendered versions as PDFs in PHP?

I just need some clarity here on whether this concept is possible, or whether I have misunderstood what crawlers are capable of.
Say I have a list of 100 websites/blogs, and every day my program (I am assuming it's a crawler of some kind) will crawl through them; if there is a match for a specified phrase like "miami heat" or "lebron james", it will download that page, convert it to a PDF with full text and images, and save that PDF.
So my questions are:
Is this type of thing possible? Please note that I don't want just text snippets; I'm hoping to get the entire page as if it had been printed out on a piece of paper.
Are these kinds of programs called crawlers?
I am planning to build on code from http://phpcrawl.cuab.de/about.html
This is perfectly possible. Since you are going to use phpcrawl to crawl the web pages, use wkhtmltopdf to convert the HTML to PDF as-is.
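Roughly, the two fit together like this (the class and method names follow phpcrawl's published example as far as I recall, so check them against the version you download; wkhtmltopdf must be installed and on the PATH, and the phrase and paths are placeholders):

```php
<?php
// Sketch: crawl a site with phpcrawl and, when a page mentions a target phrase,
// render that URL to PDF with the external wkhtmltopdf tool.
require_once 'libs/PHPCrawler.class.php';

class KeywordCrawler extends PHPCrawler
{
    public function handleDocumentInfo($DocInfo)
    {
        // $DocInfo->content holds the page source per the PHPCrawlerDocumentInfo docs
        if (stripos($DocInfo->content, 'miami heat') !== false) {
            $pdf = 'pdfs/' . md5($DocInfo->url) . '.pdf';
            // wkhtmltopdf fetches and renders the page itself, images included
            exec('wkhtmltopdf ' . escapeshellarg($DocInfo->url) . ' ' . escapeshellarg($pdf));
        }
    }
}

$crawler = new KeywordCrawler();
$crawler->setURL('http://www.example-blog.com/');     // placeholder; loop over your 100 sites
$crawler->addContentTypeReceiveRule('#text/html#');
$crawler->go();
```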
Yes, it is possible: with the wkhtmltopdf tool you can convert a web page as-is. It's desktop-based software, so you can install it on your machine.
Yes, crawlers.
It's a perfect tool for building what you want to build.
Yes it is possible.
You could call it a crawler or a scraper, as you are scraping data off the websites.
Rendering the website to a PDF will probably be the hardest part; there are web services that can do this for you.
For example:
http://pdfmyurl.com/
(I have no affiliation, and I have never used them; it was just the first site in the Google results when I checked.)

PDF book reader in PHP?

I have to upload PDF files and need them to be readable on my website in a book-reader style, like a presentation. Please show me the possible ways to achieve this.
Thank you
I've been using FlexPaper. I use pdf2swf to convert the PDF to SWF since I use the Flash version, but there is a JavaScript version too.
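The conversion step is just a call out to pdf2swf from SWFTools, roughly like this (paths are placeholders and pdf2swf must be installed):

```php
<?php
// Sketch: convert an uploaded PDF to SWF for FlexPaper using pdf2swf (SWFTools).
$source = 'uploads/book.pdf';
$target = 'swf/book.swf';
exec('pdf2swf ' . escapeshellarg($source) . ' -o ' . escapeshellarg($target));
```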
One possible solution would be to use Scribd. You simply upload your document to their website and embed their reader on yours. This is the easiest way, and you get features like search. Their reader also works much like Adobe's Acrobat Reader.
The downside is that you are uploading your documents to a public website, so everyone will be able to view them. They may have settings that let you lock your documents so that only certain people can see them.
The next solution is to roll your own. You can use turn.js. In this case, you will need a way to convert your PDF files to HTML files or to image files. With images, your text won't be selectable and won't be discoverable by search engines; converting PDF to HTML can also be difficult, as you may lose formatting in the process.
But it is entirely up to your use case. Personally, I would go with Scribd, as their platform works very well and you won't have to worry about implementing your own system.
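If you do roll your own with turn.js, a minimal sketch of the image-based approach looks like this (it assumes the page images were pre-generated, e.g. with Imagick, and that jQuery and turn.js are already included on the page; the page count, paths, and sizes are placeholders):

```php
<?php
// Sketch: emit turn.js markup with one image per PDF page.
$pageCount = 12; // placeholder; derive from the converted PDF in practice
echo '<div id="flipbook">';
for ($i = 1; $i <= $pageCount; $i++) {
    echo '<div class="page"><img src="pages/' . $i . '.jpg" alt="Page ' . $i . '"></div>';
}
echo '</div>';
echo '<script>$("#flipbook").turn({ width: 800, height: 600 });</script>';
```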

Copying specific data from a website to Excel

A friend of mine has asked me to figure out a way of getting information from a website and putting it into an Excel file.
This is the website in question: http://www.manta.com/world/North+America/Canada/Newfoundland/grocery_stores--B619B/#Location
He wishes to have an excel file with a list of all the names, addresses and phone numbers of all the results of his search.
So far I'm stumped in coming up with an idea. I'm fairly new to internet programming.
I was thinking that maybe I could create a Greasemonkey userscript which would search for all the required data on the page and, at the click of a button, open a pop-up with the data in CSV format, which could then be copied and pasted into Excel. However, the phone numbers aren't on the search results page, so I don't think this is possible.
My second thought was to create a webpage that would search that site and get all the required data, then provide a "Download data to Excel" option.
Are these ideas possible and how would I best go about doing them? Is there a better way?
Thanks!
This would be easier to answer if we knew what languages you're familiar with.
Assuming Windows, this can be done using JScript or VBScript on WSH with WinHttpRequest; Excel can be accessed via ActiveX. If you need a UI, I would suggest an HTA.
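If you instead go with the "webpage with a Download to Excel option" idea from the question, a rough PHP sketch would look like this (the XPath selector is hypothetical, Manta's real markup will differ, and the phone numbers would likely need a second request per listing):

```php
<?php
// Sketch: fetch a results page, pull out listing names, and offer the rows as a
// CSV download that Excel opens directly. The XPath below is a placeholder and
// must be adapted to the real markup of the page being scraped.
$html = file_get_contents('http://www.manta.com/world/North+America/Canada/Newfoundland/grocery_stores--B619B/');

$doc = new DOMDocument();
@$doc->loadHTML($html);                      // suppress warnings from imperfect HTML
$xpath = new DOMXPath($doc);

header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="grocery_stores.csv"');

$out = fopen('php://output', 'w');
fputcsv($out, array('Name'));
foreach ($xpath->query('//a[@class="company-name"]') as $link) {   // hypothetical selector
    fputcsv($out, array(trim($link->textContent)));
}
fclose($out);
```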
