Hello,
I want to block site rippers and bots, so I added the lines below to my robots.txt and ran a test with IntelliTamper, but it still manages to grab the pages. Do you have another trick?
Code:
User-agent: Alexibot
User-agent: asterias
User-agent: BackDoorBot/1.0
User-agent: Black Hole
User-agent: BlowFish/1.0
User-agent: BotALot
User-agent: BuiltBotTough
User-agent: Bullseye/1.0
User-agent: BunnySlippers
User-agent: Cegbfeieh
User-agent: CheeseBot
User-agent: CherryPicker
User-agent: CherryPickerElite/1.0
User-agent: CherryPickerSE/1.0
User-agent: CopyRightCheck
User-agent: cosmos
User-agent: Crescent
User-agent: Crescent Internet ToolPak HTTP OLE Control v.1.0
User-agent: DISCo Pump 3.1
User-agent: DittoSpyder
User-agent: EmailCollector
User-agent: EmailSiphon
User-agent: EmailWolf
User-agent: EroCrawler
User-agent: ExtractorPro
User-agent: Foobot
User-agent: Harvest/1.5
User-agent: hloader
User-agent: httplib
User-agent: humanlinks
User-agent: InfoNaviRobot
User-agent: IntelliTamper
User-agent: JennyBot
User-agent: Kenjin Spider
User-agent: LexiBot
User-agent: libWeb/clsHTTP
User-agent: LinkextractorPro
User-agent: LinkScan/8.1a Unix
User-agent: LinkWalker
User-agent: lwp-trivial
User-agent: lwp-trivial/1.34
User-agent: Mata Hari
User-agent: Microsoft URL Control - 5.01.4511
User-agent: Microsoft URL Control - 6.00.8169
User-agent: MIIxpc
User-agent: MIIxpc/4.2
User-agent: Mister PiX
User-agent: moget
User-agent: moget/2.1
User-agent: NetAnts
User-agent: NetAttache
User-agent: NetAttache Light 1.1
User-agent: NetMechanic
User-agent: NICErsPRO
User-agent: Offline Explorer
User-agent: Openfind
User-agent: Openfind data gathere
User-agent: ProPowerBot/2.14
User-agent: ProWebWalker
User-agent: psbot
User-agent: QueryN Metasearch
User-agent: RepoMonkey
User-agent: RepoMonkey Bait & Tackle/v1.01
User-agent: RMA
User-agent: SiteSnagger
User-agent: SpankBot
User-agent: spanner
User-agent: SuperBot
User-agent: SuperBot/2.6
User-agent: suzuran
User-agent: Szukacz/1.4
User-agent: Teleport
User-agent: Telesoft
User-agent: The Intraformant
User-agent: TheNomad
User-agent: TightTwatBot
User-agent: Titan
User-agent: toCrawl/UrlDispatcher
User-agent: True_Robot
User-agent: True_Robot/1.0
User-agent: turingos
User-agent: URLy Warning
User-agent: VCI
User-agent: VCI WebViewer VCI WebViewer Win32
User-agent: Web Image Collector
User-agent: WebAuto
User-agent: WebBandit
User-agent: WebBandit/3.50
User-agent: WebCopier
User-agent: webcopy
User-agent: WebEnhancer
User-agent: WebmasterWorldForumBot
User-agent: webmirror
User-agent: WebReaper
User-agent: WebSauger
User-agent: website extractor
User-agent: Website Quester
User-agent: Webster Pro
User-agent: WebStripper
User-agent: WebStripper/2.02
User-agent: WebZip
User-agent: WebZip/4.0
User-agent: Wget
User-agent: Wget/1.5.3
User-agent: Wget/1.6
User-agent: WinHTTrack
User-agent: WWW-Collector-E
User-agent: Xenu's
User-agent: Xenu's Link Sleuth 1.1c
User-agent: Zeus
User-agent: Zeus 32297 Webster Pro V2.9 Win32
User-Agent: MJ12bot
User-agent: HTTrack
User-agent: HTTrack 3.0
User-agent: TurnitinBot
User-agent: QuepasaCreep
Disallow: /

User-agent: *
Allow: /
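Since robots.txt is only a polite request that tools like IntelliTamper are free to ignore, the usual follow-up is to refuse such requests at the server itself. Below is a minimal .htaccess sketch, assuming Apache with mod_rewrite enabled; only a few of the user agents from the list above are shown, and a determined scraper can still spoof its User-Agent header.

Code:
# Minimal sketch (assumes Apache with mod_rewrite enabled).
# robots.txt is advisory only, so agents that ignore it are
# refused here; the names below are examples from the list
# above, not an exhaustive set.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} IntelliTamper [NC,OR]
RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
RewriteCond %{HTTP_USER_AGENT} WebZip [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Wget [NC]
RewriteRule .* - [F]

The [NC] flag makes the match case-insensitive, and [F] returns a 403 Forbidden instead of serving the page.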
Thanks for your replies.