My Article Database: Free Articles for Teaching and Studying English as a Foreign Language in China - by Paul Sparks

What is the Robot Text File?

by: Alan Murray


The robots text file is used to disallow specific search engine spiders, or all of them, access to folders or pages that you don't want indexed.
Why would you want to do this?

You may have created a personnel page for company employees that you don't want listed. Some webmasters use it to exclude their guest book pages to stop people spamming them. There are many different reasons to use the robots text file.
How do I use it?

You need to upload it to the root of your web site or it will not work. If you don't have access to the root, you will need to use a robots Meta tag to disallow access instead. Each entry needs both a user agent and a file or folder to disallow.
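If you can't reach the root, the per-page alternative mentioned above is a robots Meta tag placed in the page's head. A minimal sketch (the page itself is just an illustration):

```html
<head>
  <!-- Tells compliant spiders not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
  <title>Company Staff Page</title>
</head>
```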
What does it look like?
It's really nothing more than a "Notepad" type .txt file named "robots.txt".
The basic syntax is:
User-agent: spider's name here
Disallow: /filename here
If you use:
User-agent: *
the * acts as a wildcard, and the rules that follow apply to all spiders. You may want to use this to stop search engines listing unfinished pages.
To disallow an entire directory use:
Disallow: /mydirectory/
To disallow an individual file use:
Disallow: /file.htm
You have to use a separate line for each disallow. You cannot, for example, use:
Disallow: /file1.htm, /file2.htm
Instead you should use:
User-agent: *
Disallow: /file1.htm
Disallow: /file2.htm
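A quick way to see these rules in action is Python's standard robots.txt parser, which reads the file the same way a well-behaved spider does (the example.com domain here is just a placeholder):

```python
# Sketch of how a crawler interprets the rules above,
# using Python's standard-library urllib.robotparser.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /file1.htm",
    "Disallow: /file2.htm",
])

# The two listed files are blocked for every spider; everything else is allowed.
print(rp.can_fetch("*", "http://example.com/file1.htm"))  # False
print(rp.can_fetch("*", "http://example.com/index.htm"))  # True
```

Note that each Disallow line is parsed separately, which is why the comma-separated form above doesn't work.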

For a list of spider names visit:
http://www.robotstxt.org/wc/active/html/
Make sure you use the right syntax; if you don't, it will not work. You can check your syntax here:
http://www.searchengineworld.com/cgi-bin/robotcheck.cgi

For help creating robots text files there is a program called RoboGen. There is a free version and an advanced version, which costs $12.99:
http://www.rietta.com/robogen/


About the author:
Alan Murray is a Certified Internet Webmaster Professional and provides SEO services and website design.

http://www.designprofessional.co.uk/SEO-Services.htm


Circulated by Article Emporium
