
whitedragon101

macrumors 65816
Original poster
Sep 11, 2008
1,336
334
I need to restrict PDF and video content on a website so it isn't accessible to non-users. For the PDFs, I put them above the public_html folder and used a PHP script to fetch each PDF with file_get_contents().
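A minimal sketch of that kind of PHP gateway. The folder path, the script name, and the $_SESSION['user_id'] login flag are all assumptions, not details from the post:

```php
<?php
// Hypothetical gateway script (pdf.php?file=report.pdf) for the
// "PDFs above public_html, served by PHP" approach.

// Reduce the request to a bare *.pdf filename so "../" tricks cannot
// escape the protected folder; return null if the name is invalid.
function safe_pdf_path(string $requested, string $dir): ?string
{
    $name = basename($requested);              // drops any directory parts
    if ($name === '' || substr($name, -4) !== '.pdf') {
        return null;
    }
    return rtrim($dir, '/') . '/' . $name;
}

if (PHP_SAPI !== 'cli') {                      // only run when served over HTTP
    session_start();
    if (empty($_SESSION['user_id'])) {         // assumed login flag
        http_response_code(403);
        exit('Login required');
    }
    // Assumed location above the web root:
    $path = safe_pdf_path($_GET['file'] ?? '', '/home/site/protected_pdfs');
    if ($path === null || !is_file($path)) {
        http_response_code(404);
        exit;
    }
    header('Content-Type: application/pdf');
    header('Content-Length: ' . filesize($path));
    readfile($path);  // copies to output in chunks, unlike file_get_contents()
}
```

One design note: readfile() writes the file straight to the output buffer rather than building the whole thing as a PHP string first, so it avoids the memory cost of file_get_contents() on large files.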

The video side, however, has been a nightmare. Because the videos can be long, I need a solution that still streams the file to the HTML5 player without waiting for the whole download (I'm not sure fetching the whole file would even work, since it would mean holding a huge video file in memory).
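Streaming through PHP is possible if the script honours the HTTP Range header, which is what an HTML5 player sends to seek and to fetch the file piecewise. Below is a sketch under the same assumptions as before (hypothetical paths and session flag); the only real requirements are answering 206 Partial Content with a Content-Range header and copying the file in small chunks:

```php
<?php
// Parse an HTTP Range header like "bytes=500-999" against a file size.
// Returns [start, end] or null if the header is absent or unsatisfiable.
function parse_range(?string $header, int $size): ?array
{
    if ($header === null || !preg_match('/^bytes=(\d*)-(\d*)$/', $header, $m)) {
        return null;
    }
    if ($m[1] === '' && $m[2] === '') {
        return null;
    }
    if ($m[1] === '') {                        // "bytes=-500": last 500 bytes
        $start = max(0, $size - (int)$m[2]);
        $end   = $size - 1;
    } else {
        $start = (int)$m[1];
        $end   = ($m[2] === '') ? $size - 1 : min((int)$m[2], $size - 1);
    }
    return ($start > $end || $start >= $size) ? null : [$start, $end];
}

if (PHP_SAPI !== 'cli') {                      // web request: stream the video
    session_start();
    if (empty($_SESSION['user_id'])) {         // assumed login flag
        http_response_code(403);
        exit;
    }
    // Assumed folder above the web root:
    $path = '/home/site/protected_videos/' . basename($_GET['file'] ?? '');
    if (!is_file($path)) {
        http_response_code(404);
        exit;
    }
    $size  = filesize($path);
    $range = parse_range($_SERVER['HTTP_RANGE'] ?? null, $size);
    [$start, $end] = $range ?? [0, $size - 1];
    if ($range !== null) {
        http_response_code(206);               // Partial Content
        header("Content-Range: bytes $start-$end/$size");
    }
    header('Accept-Ranges: bytes');
    header('Content-Type: video/mp4');         // assumed container
    header('Content-Length: ' . ($end - $start + 1));

    $fh = fopen($path, 'rb');
    fseek($fh, $start);
    $left = $end - $start + 1;
    while ($left > 0 && !feof($fh)) {          // 8 KB chunks: the whole file
        $chunk = fread($fh, min(8192, $left)); // is never held in memory
        echo $chunk;
        $left -= strlen($chunk);
        flush();
    }
    fclose($fh);
}
```

Because each response only covers the requested byte range, the player can start playback immediately and seek anywhere, and PHP's memory use stays at one chunk regardless of file size.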

My only idea is to place the videos in a folder under public_html and use file and folder names made of long random strings (say, 64 random characters), so nobody could logically guess their way through the directory structure. I'd then use robots.txt to request no crawling and .htaccess to set no indexing. My question with this solution: can people just hand my domain name to a crawler and get back the locations and filenames of the videos anyway?
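For reference, a sketch of what those two files could look like. The folder name is a made-up placeholder, and the X-Robots-Tag line assumes Apache's mod_headers is enabled; note that robots.txt is itself public, so a blanket Disallow avoids printing the secret folder name in it:

```
# robots.txt (in the web root) -- a request honoured by polite
# crawlers only, not a security control
User-agent: *
Disallow: /

# .htaccess inside the hidden folder (e.g. /v-3f9a.../ -- placeholder name)
Options -Indexes
Header set X-Robots-Tag "noindex, nofollow"
```

Options -Indexes stops Apache from auto-generating a directory listing, which is the main way a random folder name leaks once someone knows the parent path.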

Any help would be appreciated :)