Posts: 1576 | Days Won: 9
Posts posted by Gonzalez
-
For sale today is a freshly made reseller website for Facebook invites, live traffic, YouTube views, and backlinks, aimed at webmasters looking for paid traffic, backlinks, more Twitter followers, and Facebook fans/likes via invites to increase their web presence, click-through rate, potential leads, and their Alexa and search engine rankings. A premium business website with a clean, presentable look.
Includes a great, brandable domain: TrafficXL.net
Please note: this is NOT like 99% of other start-up sites on the internet that were put together in 10 minutes with poor-quality content; this is quality!
Q. How much can I make?
This really is a simple, working way to make some extra money with very little effort. It can make an easy $500 per week with just a small bit of promotion. This business model has existed for the last decade and is still in use, because every day new web entrepreneurs enter this competitive market hoping to establish a web presence. Do you think all of them manage to keep it? No, and that is the market we will capture with this business model, via this nicely made professional website.
The business model I am selling is simple to understand: we sell traffic and backlink packages to website owners through this website, and we take a cut from every package sold:
FaceBook Invites:-
1. 1000 Invites
2. 2000 Invites
3. 5000 Invites
4. 10000 Invites
Back Link Packs :-
1. 5 Permanent Back Links
2. 10 Permanent Back Links
3. 50 Permanent Back Links
4. 100 Permanent Back Links
Live Traffic Packs:-
1. 10,000 visitors
2. 25,000 visitors
3. 50,000 visitors
4. 100,000 visitors
YouTube Channel and Video views:-
1. 5,000 Views
2. 10,000 Views
3. 25,000 Views
4. 50,000 Views
Q. What do I get with the package?
* The whole website : TrafficXL.net
* Domain name : TrafficXL.net
* Content : Entire website content of TrafficXL.net
* BONUS: Ways to promote the website will be shared with the auction winner.
Q. So how much is this going to cost me?
Bids start at JUST $79
I am a web developer, and it's my job to create and sell good sites with big potential; the same applies to this site. This is how I earn my living. Building this website yourself would obviously cost a lot of money, but my loss is your gain.
Q. What technical skills do I need? I’ve never run a website before.
If you have an internet connection and can speak basic English – then you have all the skills required! It really is that easy!
I will provide support for up to [10 days], so if you get stuck you can always get hold of me via [chat, email,]
Google AdSense: it's not necessary, but if you want, I can place your AdSense ads on the site too.
You may ask where I will get the traffic, YouTube views, backlinks, and Facebook invites sold on the website. The answer is that you never have to generate the traffic you sell. When orders come in, you simply forward them to cheaper traffic suppliers and buy the traffic and YouTube views from them for about half the price. They do the rest, including delivering the traffic to the website whose owner placed the order with you. This means you earn a handsome commission from outsourcing the entire workload and can concentrate on your other projects, leaving the website running almost on autopilot (it takes just a few minutes to place the cheap order on the supplier's site). Each sale earns you a profit of roughly 40% to 50% of the order value. There are different packages, and if you wish to change the prices or visitor quantities, you can easily do so, or I can assist you.
Q. Where will I get cheap outsourcers for all four of these services?
Glad you asked. You don't have to worry here either; I will sort out all queries. I will provide the winning bidder with 3 to 5 different suppliers for traffic, backlinks, Twitter followers, and Facebook invites, so there will always be a backup source of wholesale traffic, YouTube views, backlinks, and Facebook services.
Every website needs traffic and the other services this site sells. New websites are created every day, so there is no shortage of customers for the services TrafficXL.net offers. You can't go wrong with this website.
The site can also run Google AdSense to make some extra profit from ads.
Q. I have a full-time job, can I run this in my spare time?
The website basically runs itself. When visitors click on ads or buy products, you get paid! You can leave the site as it is or add more content if you desire.
The website doesn't require any maintenance or updating, ever. All payments will go to your PayPal account, then you just forward the orders to the suppliers. Very simple and good business.
BID: $79
BIN: $150
-
Google domination! GET 2000+ Backlinks for as low as $10
I won’t tire you with a sales pitch, so let’s get straight to business: we all want a boost in the SERPs, and we know that backlinks work.
2,000 Verified backlinks - $10.00
5,000 Verified backlinks - $20.00
10,000 Verified backlinks- $25.00
20,000 Verified backlinks- $50.00
50,000 Verified backlinks - $90.00
100,000 Verified backlinks - $250.00
300,000 Verified backlinks - $500.00
*NEW* PR4+ VERIFIED BACKLINKS PACKAGES
200 PR4+ Verified backlinks – $15
500 PR4+ Verified backlinks – $30
1000 PR4+ Verified backlinks – $50
You can point them to:
your websites directly
Web 2.0 properties and linkwheels
Ezine articles
social bookmarks
high PR forum profiles
Doing this will allow you to get link juice from thousands of backlinks to your money site using higher quality backlinks as a buffer!
Why Order From Me?
Incredibly Low Prices
Fast turnaround
Unlimited amount of anchors and URLs (read more on that below)*
Special characters
Special characters are all supported; however, please notify me in advance if your anchors or URLs contain them, because characters like “ø” can easily be missed.
Unlimited URLs and anchors explained
*You can also provide as many URLs and anchors as you want and the number of anchors and URLs does not have to be equal.
However, I will not tie specific keywords to specific URLs; they must all work together.
Example:
If you have 10 keywords for three URL’s, all 10 keywords must be suitable as anchor text for all three URL’s provided.
Refund Policy: If you do not get your report within 5 days after you have ordered, you will get a full refund no questions asked.
Payment: Paypal
-
Small Package: $39.95
Plan Includes:
150 .EDU & .GOV Backlinks
200 Social Bookmarks
200 Profile Backlinks
5 Linkwheels or Link Pyramids
Medium Package: $69.95
Plan Includes:
250 .EDU & .GOV Backlinks
350 Social Bookmarks
400 Profile Backlinks
10 Linkwheels or Link Pyramids
150 PR 2 - 7 Backlinks
Complete Package: $99.95
Plan Includes:
400 .EDU & .GOV Backlinks
500 Social Bookmarks
600 Profile Backlinks
20 Linkwheels or Link Pyramids
200 PR 2 - 7 Backlinks
150 PR 8 & 9 Backlinks
Paypal only
-
An exception can be made, but the site's malware has to be hidden rather than in plain sight, and of course there must not be so much of it that it draws attention.
-Gonzalez
-
Gonzalez, is the site yours? And if I buy from this site and the site I send the visitors to happens to contain malware, will they cut off my traffic?
What site with malware? Is it hidden?
-Gonzalez
-
That's not possible, michee.
-Gonzalez
-
Gonzalez, is the site yours? And if I buy from this site and the site I send the visitors to happens to contain malware, will they cut off my traffic?
It's mine; I don't accept sites with malware. Only normal sites, warez, casino, and adult.
-Gonzalez
-
TrafficBirds.com
Real visitors, not bots!
Targeted traffic from:
Africa, Asia, Australia, Belgium, Brazil, Brunei, Canada, China, Denmark, Estonia, European Union, France, Germany, Greece, India, Indonesia, Italy, Japan, Kuwait, Malaysia, Mexico, Middle Eastern, Netherlands, New Zealand, North America, Norway, Panama, Peru, Philippines, Poland, Portugal, Puerto Rico, Romania, Singapore, South America, Spain, Sweden, Thailand, UK, United States, USA – Central, USA – Eastern, USA – Mountain, USA – Pacific, Vietnam, Worldwide
Targeted traffic (warez included)
10,000 Visitors
Price: $39.95
25,000 Visitors
Price: $99.95
50,000 Visitors
Price: $179.95
100,000 Visitors
Price: $299.95
Casino traffic
10,000 Visitors
Price: $39.95
25,000 Visitors
Price: $110.95
50,000 Visitors
Price: $210.95
100,000 Visitors
Price: $399.95
Adult traffic
10,000 Visitors
Price: $39.95
25,000 Visitors
Price: $110.95
50,000 Visitors
Price: $210.95
100,000 Visitors
Price: $399.95
Payment is made on the site:
Getting you traffic | TrafficBirds.com
Or here; send me a PM.
Payment method: PayPal
-
The site is new, with great potential to turn an instant profit.
Trafficbirds.com offers traffic packages:
10,000 Visitor Package
Provider Charge: $16
Your Sale Price: $39.95
Your Profit: $23.95
25,000 Visitor Package
Provider Charge: $37.50
Your Sale Price: $99.95
Your Profit: $62.45
50,000 Visitor Package
Provider Charge: $67.50
Your Sale Price: $179.95
Your Profit: $112.45
100,000 Visitor Package
Provider Charge: $115
Your Sale Price: $299.95
Your Profit: $184.95
What do you get?
The entire site
2 traffic providers
The site is brand new.
Price: $60
Payment method: PayPal.
If interested, send a PM.
-Gonzalez
-
Scriptalicious SEO Scripts and SEO Scripts Pro combine 77 of the most popular and useful SEO scripts and Webmaster SEO tools in one package!
Add 77 SEO scripts and SEO tools to your Web site in minutes.
Features:
* SEO Friendly URLs
* Powerful Admin CP
* Manage Tools & Categories
* Built-in CMS
* Site Links Manager
* WYSIWYG Editor
* Customizable Themes
* Includes 6 styles
Optimize Your Sites | Study the Competition | Save Time
Get More Backlinks | Improve Rankings | Get the Edge on SEO
SEO Scripts Pro
77 powerful SEO scripts. Adds 40 brand new tools to your SEO arsenal:
* Keyword Rich Domain Finder
* Link Extractor
* Keyword Extractor
* Twitter Links Finder
* Digg Links Checker
* Delicious Link Checker
* Bing Indexed Pages Checker
* Site Rank Checker
* Authority Link Checker
* .edu Backlink Checker
* .gov Backlink Checker
* DMOZ Backlink Checker
* Compete Ranking Checker
* Convert UNIX Timestamp
* Convert Date to Timestamp
* Email to Image Converter
* Alexa Multiple Rank Checker
* Multiple PageRank Checker
* Multiple Backlink Checker
* Robots.txt Checker
* Multiple PageHeat Checker
* Multiple Fake PageRank Checker
* Multiple .edu Backlink Checker
* Multiple .gov Backlink Checker
* Multiple DMOZ Backlink Checker
* Multiple IP Address Checker
* Multiple Reverse IP Lookup
* HTML Markup Validator
* CSS Validator
* Google Indexed Pages Checker
* Yahoo Indexed Pages Checker
* Multiple Compete Rank Checker
* Compete Statistics Checker
* MSN Live Indexed Pages Checker
* Multiple Compete Statistics Checker
* Multiple Digg Links Checker
* Multiple Delicious Link Checker
* Yahoo Directory Backlink Checker
* Webpage Size Checker
* Multiple Authority Link Checker
http://oron.com/17nr4qafcyrn/scriptalisiouspro.rar.html
-
Sold; the topic can be closed.
Thank you.
-Gonzalez
-
Google.as
Google.com.ar
Google.com.au
Google.at
Google.az
Google.be
Google.com.br
Google.vg
Google.bi
Google.ca
Google.cl
Google.com.co
Google.cd
Google.cg
Google.co.cr
Google.com.cu
Google.dk
Google.dj
Google.com.do
Google.com.sv
Google.com.fj
Google.fi
Google.fr
Google.gm
Google.de
Google.com.gr
Google.gl
Google.gg
Google.hn
Google.com.hk
Google.ie
Google.co.il
Google.it
Google.co.jp
Google.co.je
Google.kz
Google.lv
Google.co.ls
Google.com.ly
Google.li
Google.lu
Google.com.my
Google.com.mt
Google.mu
Google.com.mx
Google.fm
Google.ms
Google.com.np
Google.nl
Google.co.nz
Google.com.ni
Google.com.nf
Google.no
Google.com.pk
Google.com.pa
Google.com.pe
Google.com.ph
Google.pn
Google.pl
Google.pt
Google.com.pr
Google.ro
Google.com.ru
Google.rw
Google.sm
Google.com.sg
Google.co.za
Google.co.kr
Google.es
Google.sh
Google.com.vc
Google.se
Google.ch
Google.com.tw
Google.co.th
Google.com.tr
Google.com.ua
Google.ae
Google.co.uk
Google.com.uy
Google.uz
Google.co.ve
Google.com.vn
aeiwi.com
Exact Seek
Info Tiger
Scrub
The Web
Search UK
Aesop
walhello.com
szukacz.pl
caloga.com
Webwizard.at
infosniff.com
agada.de
biveroo.de
caloweb.de
camana.de
miko.de
Onlinepilot.de
Arianna.it
Libero.it
Gooru.pl
Yandex.ru
Searchengine.de
Findonce.co.uk
Mirago.co.uk
Bigfinder.com
Exploit.net
Metawebsearch.com
Searchengine.fr
SearchEngine.nl
SearchEngine.uk
Google.com.py
eurip.com
WhatUseek.com
InfoTiger.com
Rediff.com
SearchUK.com
AxxaSearch
Look here:
http://www.fullydown.net/URLsubmit/index.php
and hit submit
-Gonzalez
-
Site Name : JustDL.com
It's a warez site based on the DLE CMS (DataLife Engine).
Traffic isn't high; I made the site a week ago. I'm waiting for traffic from Google, since I've done a bit of SEO on it.
What do you get?
Domain valid until March 2012
Database backup
Uploaders
$10 banner.
Price: $20
Site name: devilived.net
Traffic coming from DDL sites: nexus 2*, zunox 2*; for the rest I have to request a re-rate.
Recently optimized for Google: on-page and off-page SEO.
What do you get?
Domain valid until June 2011
Database backup
Uploaders
$15 banner.
Both sites for $50!
Price starting from $20 and up
BIN: $50
Why am I selling?
I'm no longer into warez; I've earned enough and from now on I'm focusing on other niches.
-
Thanks.
-Gonzalez
-
That's what I'm looking for too, Brabus; I have a sum of money and want to invest it online. Thanks for the ideas, guys; I'm open to more.
-Gonzalez
-
Stop searching, here you go:
Site:
http://qip.ro/ultimate-submitter-kit/
Download:
http://www.brothersoft.com/ultimate-submitter-kit-437665.html
eNjoy
-Gonzalez
-
Hi, I have also created a website-submitter application, and it's completely free. The shareware programs on the market prompted me to build an application of this kind myself and offer it to users. The application was first named Virtual Design Submitter, but starting with version 3 I decided to call it Ultimate Submitter Kit; it is now at version 3.7.1.
What you can do with this application:
* Use multiple site profiles,
* Save a created profile and reopen it later,
* Select web directories by multiple criteria: all directories; all directories except those already submitted to; directories with a given PageRank; directories with a given interface language,
* Watch the site-submission progress in real time,
* Generate a submission report,
* Update the web-directory database,
* Suggest a new web directory,
* Manage the web-directory database.
Google: Ultimate Submitter Kit 3.7.1
I hope this message won't be considered spam; if it is, my apologies to the administrators.
Regards, Marian.
What do you need to do? Google: Ultimate Submitter Kit 3.7.1
-Gonzalez
-
Check your inbox.
-Gonzalez
-
The Multi-Poster series has just got seriously awesome! The fifth generation of the multi-poster engine is faster, smarter and best of all, can post to more website types! A new approach to the engine design allows for many website types to be added easily in as little as 10 or 20 lines of code.
Human verification routines such as captcha, reCaptcha and random questions are all supported, conveniently allowing you to type in the required data as and when needed. Once the topics have been created, the application provides links for the user to double-click and view the topics. This information is also logged and available for the user to browse as and when required.
Replying to topics is also supported, allowing you to post new data easily and conveniently!
The new engine allows the user to send as many as 250 topics at the same time via multi-threaded routines, and sports some ultra-cool "always logged in" abilities, increasing posting speed by 33% compared to previous versions and helping bypass "bot detectors".
Features:
Process up to 250 threads at the same time! Now that's fast!
Support for many forum requirement types such as Topic Icons, Tags, Prefixes, etc.
Integrated link-checker allows you to check your download links before you send, or any time after.
Checks your post layout to ensure you don't send bad posts.
Integrated post preview and live previewer, allowing you to see your modifications real time.
Smart "always logged in" system improves speed and helps avoid bot detection.
License: $12
Payment method: PayPal.
The license normally costs $20.
-Gonzalez
-
How to Get Search Engines to Discover (Index) All the Web Pages on Your Site
by Christopher Heng
If your site is one of those websites where only a few pages seem to be indexed by the search engines, this article is for you. It describes how you can provide the major search engines with a list of all the pages on your website, allowing them to learn of the existence of pages they may have missed in the past.
How do you Find Out which Pages of your Website are Indexed?
How do you know which pages of your site have been indexed by a search engine and which have not? One way is to use a "site:domain-name" query. This works with Google, Yahoo and Microsoft Live, although not with Ask.
For example, if your domain is example.com, type "site:example.com" (without the quotes) into the search field of the search engine. From the results list, you should be able to see all the pages which the search engine knows about. If you find that a page from your site is not listed, and you have not intentionally blocked it using robots.txt or a meta tag, then perhaps that search engine does not know about that page or has been unable to access it.
Steps to Getting the Search Engine to Discover and Index Your Whole Site
Here's what to do, when you discover that there are pages not indexed by the search engine.
Check Whether Search Engines are Blocked from that Page
The first thing to do is to check your robots.txt file, and make sure it complies with the rules of a robots.txt file. Many webmasters, new and old, unintentionally block a search engine from a part of their site by having errors in their robots.txt file.
Another thing you might want to do is to make sure that your web page does not have a meta tag that prevents a robot from indexing a particular page. This may occur if you have ever put a meta "noindex" tag on the page, and later wanted it indexed but forgot to remove it.
Create a File Using the Sitemap Protocol
The major search engines, Google, Yahoo, Live and Ask, all support something known as a Sitemap file. This is not the "Site Map" that you see on many websites, including thesitewizard.com. My Site Map and others like it are primarily designed to help human beings find specific pages on the website. The sitemap file that uses the Sitemap protocol is, instead, designed for search engines, and is not at all human-friendly.
Sitemaps have to adhere to a particular format. The detailed specifications for this can be found at the sitemaps.org website. It is not necessary to use every aspect of the specification to create a site map if all you want is to make sure the search engines locate all your web pages. Details on how to create your own sitemap will be given later in this article.
Modify Your Robots.txt File for Sitemaps Auto-Discovery
As a result of the sitemap protocol, an extension to the robots.txt file has been agreed by the search engines. Once you have finished creating the sitemap file and uploaded it to your website, modify your robots.txt file to include the following line:
Sitemap:
http://www.example.com/name-of-sitemap-file.xml
You should change the web address ("URL") given to the actual location of your sitemap file. For example, change
www.example.com
to your domain name and "name-of-sitemap-file.xml" to the name that you have given your sitemap file.
If you don't have a robots.txt file, please see my article on robots.txt for more information on how to create one. The article can be found at
http://www.wjunction.com/showthread.php?t=73109
The search engines that visit your site automatically check your robots.txt file before spidering your site. When they read the file, they will see the sitemap listed and load it for more information. This enables them to discover the pages they have missed in the past and, hopefully, index them.
How to Create a Sitemap File
A sitemap file that follows the Sitemap Protocol is just a straightforward ASCII text file. You can create it using any ordinary ASCII text editor. If you use Windows, Notepad (found in the Accessories folder of your Start menu) can be used. Do not use a word processor such as Microsoft Word.
By way of example, take a look at the following sample sitemap file.
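The original article's sample file is not reproduced in this copy, so here is a minimal stand-in that follows the Sitemap protocol; the two page URLs are placeholders for your own pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url><loc>http://www.example.com/</loc></url>
<url><loc>http://www.example.com/page.html</loc></url>
</urlset>
```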
You will notice that a sitemap file begins with the text
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
and ends with
</urlset>
Those portions of the sitemap file are invariant. All sitemaps have to begin and end this way, so you can simply copy them from my example to your own file.
Next, notice that every page on the website (that you want indexed in the search engine) is listed in the sitemap, using the following format:
<url><loc>http://www.example.com/</loc></url>
where http://www.example.com/ should be replaced by the URL of the page you want indexed. In other words, if you want to add a page, say,
http://www.example.com/sing-praises-for-thesitewizard.com.html
to your sitemap, just put the web address for that page between <url><loc> and </loc></url>, and place the entire line inside the section demarcated by the opening <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"> tag and the closing </urlset> tag.
To make your job simpler, just copy the entire example sitemap that I gave in the example above, replace all the example URLs with your own page addresses, add any more that you like, and you're done.
Save the file under any name you like. Most people save it with a ".xml" file extension; if you have no particular preference, call it "sitemap.xml". If you use Notepad, enclose the filename in quotes ("sitemap.xml") when saving, so that Notepad does not append an extra ".txt" extension.
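For a site with more than a handful of pages, editing the file by hand gets tedious. As a rough sketch (not part of the original article), here is one way to generate a protocol-compliant sitemap with Python's standard library; the page URLs are placeholders:

```python
import xml.etree.ElementTree as ET

def build_sitemap(page_urls):
    """Return a Sitemap-protocol XML document listing the given pages."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in page_urls:
        # One <url><loc>...</loc></url> entry per page.
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            + ET.tostring(urlset, encoding="unicode"))

sitemap = build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/about.html",
])
print(sitemap)  # write this string to sitemap.xml at your site root
```

The size limits mentioned in the note that follows still apply; if you exceed them, split the URL list across several sitemap files.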
Remember to update your robots.txt file as mentioned earlier to include the URL of your sitemap file, so that the search engines can learn of the existence of the file.
Note: a sitemap file cannot have more than 50,000 URLs (web addresses) nor be bigger than 10 MB. If yours is bigger than that, you'll have to create multiple sitemap files. Please see the Sitemaps site on how this can be done.
Conclusion: Dealing with Missing Pages in the Search Engine's Index
If you have pages on your website that seem to be omitted from the search engine indices, following the tips in this article will help you make sure that the search engines learn of all the pages on your web site. Of course, whether they actually go about spidering and listing them is another matter. However, with the sitemap file, you can at least know that they are aware of all the available pages on your site.
-
How to Set Up a Robots.txt File
Writing a robots.txt file is extremely easy. It's just an ASCII text file that you place at the root of your domain. For example, if your domain is www.example.com, place the file at www.example.com/robots.txt. For those who don't know what an ASCII text file is, it's just a plain text file that you create with a type of program called an ASCII text editor. If you use Windows, you already have an ASCII text editor on your system, called Notepad. (Note: only Notepad on the default Windows system is an ASCII text editor; do not use WordPad, Write, or Word.)
The file basically lists the names of spiders on one line, followed by the list of directories or files it is not allowed to access on subsequent lines, with each directory or file on a separate line. It is possible to use the wildcard character "*" (just the asterisk, without the quotes) instead of naming specific spiders. When you do so, all spiders are assumed to be named. Note that the robots.txt file is a robots exclusion file (with emphasis on the "exclusion") — there is no universal way to tell spiders to include any file or directory.
Take the following robots.txt file for example:
User-agent: *
Disallow: /cgi-bin/
The above two lines, when inserted into a robots.txt file, inform all robots (since the wildcard asterisk "*" character was used) that they are not allowed to access anything in the cgi-bin directory and its descendants. That is, they are not allowed to access cgi-bin/whatever.cgi or even a file or script in a subdirectory of cgi-bin, such as /cgi-bin/anything/whichever.cgi.
If you have a particular robot in mind, such as the Google image search robot, which collects images on your site for the Google Image search engine, you may include lines like the following:
User-agent: Googlebot-Image
Disallow: /
This means that the Google image search robot, "Googlebot-Image", should not try to access any file in the root directory "/" and all its subdirectories. This effectively means that it is banned from getting any file from your entire website.
You can have multiple Disallow lines for each user agent (ie, for each spider). Here is an example of a longer robots.txt file:
User-agent: *
Disallow: /images/
Disallow: /cgi-bin/

User-agent: Googlebot-Image
Disallow: /
The first block of text disallows all spiders from the images directory and the cgi-bin directory. The second block disallows the Googlebot-Image spider from every directory.
It is possible to exclude a spider from indexing a particular file. For example, if you don't want Google's image search robot to index a particular picture, say,
mymugshot.jpg
, you can add the following:
User-agent: Googlebot-Image
Disallow: /images/mymugshot.jpg
Remember to add the trailing slash ("/") if you are indicating a directory. If you simply add
User-agent: *
Disallow: /privatedata
the robots will be disallowed from accessing privatedata.html as well as privatedataandstuff.html, as well as the directory tree beginning at /privatedata/ (and so on). In other words, there is an implied wildcard character following whatever you list in the Disallow line.
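If you want to sanity-check rules like the ones above before deploying them, Python's standard urllib.robotparser module applies the same exclusion logic, implied trailing wildcard included. A small sketch (the bot name "SomeBot" and the example.com URLs are just placeholders):

```python
from urllib import robotparser

# The example rules from this section: all robots are barred from
# /images/ and /cgi-bin/, and Googlebot-Image from the whole site.
rules = """\
User-agent: *
Disallow: /images/
Disallow: /cgi-bin/

User-agent: Googlebot-Image
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# An ordinary crawler may fetch normal pages...
print(rp.can_fetch("SomeBot", "http://www.example.com/page.html"))      # True
# ...but nothing under a disallowed directory (implied trailing wildcard).
print(rp.can_fetch("SomeBot", "http://www.example.com/cgi-bin/x.cgi"))  # False
# Googlebot-Image is banned from the entire site.
print(rp.can_fetch("Googlebot-Image", "http://www.example.com/"))       # False
```

Keep in mind this only tells you what a well-behaved robot would do; as noted below, nothing forces a spider to obey the file.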
Where Do You Get the Name of the Robots?
If you have a particular spider in mind which you want to block, you have to find out its name. To do this, the best way is to check out the website of the search engine. Respectable engines will usually have a page somewhere that gives you details on how you can prevent their spiders from accessing certain files or directories.
Common Mistakes in Robots.txt
Here are some mistakes commonly made by those new to writing robots.txt rules.
- It's Not Guaranteed to Work
- As mentioned earlier, although the robots.txt format is listed in a document called "A Standard for Robots Exclusion", not all spiders and robots actually bother to heed it. Listing something in your robots.txt is no guarantee that it will be excluded. If you really need to block a particular spider ("bot"), you should use a .htaccess file to block that bot. Alternatively, you can also password-protect the directory (also with a .htaccess file).
- Don't List Your Secret Directories
- Anyone can access your robots file, not just robots. For example, typing
http://www.google.com/robots.txt
will get you Google's own robots.txt file. I notice that some new webmasters seem to think that they can list their secret directories in their robots.txt file to prevent that directory from being accessed. Far from it. Listing a directory in a robots.txt file often attracts attention to the directory. In fact, some spiders (like certain spammers' email harvesting robots) make it a point to check the robots.txt for excluded directories to spider.
- Only One Directory/File per Disallow line
- Don't try to be smart and put multiple directories on your Disallow line. This will probably not work the way you think, since the Robots Exclusion Standard only provides for one directory per Disallow statement.
How to Specify All the Files on Your Website
A recent update to the robots.txt format now allows you to link to something known as a sitemaps protocol file that gives search engines a list of all the pages on your website.
-
100 templates for MP4
http://www.filesonic.com/file/171671222/Temp.rar
-Gonzalez
LiveJasmin clone, or something similar
in Off-topic
Posted
Do you have a LiveJasmin clone or something similar? I'm interested in it for a project.
Thank you!
-Gonzalez