
How do I prevent my web pages from being cached?


5 replies to this topic

#1 blub


  • Members
  • 60 posts
  • OFFLINE
  • Local time:04:43 AM

Posted 22 April 2011 - 08:18 AM

I need to make a few temporary web pages on a free hosting site for the sole purpose of having my resume and cover letter reviewed. It would only be for a few days, then I would remove the content for privacy reasons. I want to make sure there are no traces left of the web pages on the internet. How would I go about making sure nothing gets cached or anything like that?

Sadly, I used to know how to do this, but if you don't use it, you lose it.


#2 groovicus


  • Security Colleague
  • 9,963 posts
  • OFFLINE
  • Gender:Male
  • Location:Centerville, SD
  • Local time:03:43 AM

Posted 22 April 2011 - 10:54 AM

Add the following to your pages:
<meta name="ROBOTS" content="NOINDEX, NOFOLLOW">

Of course there is no guarantee that the spiders will actually honor it.
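For reference, the tag goes inside the page's <head> section; a minimal sketch of a page carrying it might look like this (adding NOARCHIVE additionally asks engines not to keep a cached copy):

```html
<!DOCTYPE html>
<html>
<head>
  <title>Resume</title>
  <!-- Ask crawlers not to index this page, follow its links, or cache a copy -->
  <meta name="ROBOTS" content="NOINDEX, NOFOLLOW, NOARCHIVE">
</head>
<body>
  ...
</body>
</html>
```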

#3 blub

  • Topic Starter

  • Members
  • 60 posts
  • OFFLINE
  • Local time:04:43 AM

Posted 22 April 2011 - 01:43 PM

Thanks.

Is there a way to prevent files from being cached? I have some in .wps and .rtf format.

#4 groovicus


  • Security Colleague
  • 9,963 posts
  • OFFLINE
  • Gender:Male
  • Location:Centerville, SD
  • Local time:03:43 AM

Posted 22 April 2011 - 02:26 PM

Yes. Put them in a password-protected directory.

If you know who needs to access the files, you could use something like SkyDrive to protect access to them. You just need to hand out the password to whoever needs access.
http://skydrive.live.com

It is free, and may suit your needs better than creating a page and risking exposure. Like I said, spiders do not have to honor the NOINDEX, NOFOLLOW tag at all.
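Worth noting for the .wps/.rtf question above: a meta tag can only live in HTML, so it cannot cover those file types. If the host runs Apache and allows .htaccess overrides (an assumption; many free hosts do not), the X-Robots-Tag response header can carry the same directives for any file. A sketch:

```apache
# .htaccess — ask crawlers not to index, follow, or cache matching files.
# Requires mod_headers and a host that honors .htaccess overrides (assumption).
<FilesMatch "\.(wps|rtf)$">
  Header set X-Robots-Tag "noindex, nofollow, noarchive"
</FilesMatch>
```

As with the meta tag, this is a request, not an enforcement mechanism; only real access control (a password) actually keeps the files private.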

#5 blub

  • Topic Starter

  • Members
  • 60 posts
  • OFFLINE
  • Local time:04:43 AM

Posted 22 April 2011 - 03:24 PM

I tried using .htaccess and .htpasswd to password-protect the files, but I keep getting a 404 error. I guess I'll look into SkyDrive. I just wish I knew why I keep getting those errors.
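For what it's worth, errors from an .htaccess/.htpasswd setup are often caused by the AuthUserFile path: it must be an absolute filesystem path on the server, not a URL or web-relative path, and some free hosts disable .htaccess processing entirely. A minimal sketch, assuming Apache and a server-side home directory of /home/user (both assumptions):

```apache
# .htaccess placed in the directory to protect
AuthType Basic
AuthName "Private"
# Must be the absolute server-side path to the password file, not a URL
AuthUserFile /home/user/.htpasswd
Require valid-user
```

The .htpasswd file itself is created with Apache's htpasswd utility, e.g. `htpasswd -c /home/user/.htpasswd yourname`, and is best stored outside the web root so it can't be downloaded.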

#6 cafejose


  • Members
  • 956 posts
  • OFFLINE
  • Local time:02:43 AM

Posted 29 May 2011 - 12:17 AM

Add the following to your pages:

<meta name="ROBOTS" content="NOINDEX, NOFOLLOW">

Of course there is no guarantee that the spiders will actually honor it.


Where exactly should that tag be added?



