ALIWEB allows users to submit the locations of index files on their sites,[3][4] which enables the search engine to include their webpages along with user-written page descriptions and keywords. This empowers webmasters to define the terms that would lead users to their pages, and avoids the need for crawling bots (e.g. the Wanderer, JumpStation), which consumed bandwidth.
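ALIWEB's index files consisted of IAFA-style templates: plain-text records of field/value pairs describing each resource. The record below is an illustrative sketch of such a file, not a verbatim example from the original service; the title, path, and keywords are invented for illustration:

```
Template-Type: DOCUMENT
Title:         The Perl Page
URI:           /public/perl/index.html
Description:   Information about the Perl programming language.
Keywords:      perl, programming, language
```

Because the webmaster writes the description and keywords directly, ALIWEB only needs to fetch this one small file per site rather than crawl every page.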
Martijn Koster, who was also instrumental in the creation of the Robots Exclusion Standard,[5][6] detailed the background and objectives of ALIWEB, with an overview of its functions and framework, in a paper he presented at CERN.[2] Koster left Nexor, and development ceased in the following years until a new company took over the project. Its developers discovered that searches had never covered the entire database: rather than scanning every record and weighting the results, a query was matched only from the beginning of the database until it either ran out of matches or reached the number of results requested. The new developers fixed this by searching and weighting the entire database before displaying the results.
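A minimal sketch of the flaw described above, assuming a simple linear scan over a list of records (hypothetical code, not ALIWEB's actual implementation; the record data, function names, and scoring scheme are invented for illustration):

```python
# Each record pairs a page title with its user-written description and keywords.
RECORDS = [
    ("Perl Page", "Information on the Perl programming language", ["perl"]),
    ("WWW Robots", "Threats and treats of web robots", ["robots", "web"]),
    ("Perl FAQ", "Frequently asked questions about Perl", ["perl", "faq"]),
]

def score(record, term):
    """Crude relevance weight: a keyword hit counts more than a description hit."""
    term = term.lower()
    title, description, keywords = record
    return 2 * sum(term in k for k in keywords) + (term in description.lower())

def search_buggy(term, max_results):
    """Scan from the start of the database and stop as soon as enough matches
    are found -- later, possibly better-scoring, records are never examined."""
    hits = []
    for record in RECORDS:
        if score(record, term) > 0:
            hits.append(record)
            if len(hits) >= max_results:
                break
    return hits

def search_fixed(term, max_results):
    """Scan the whole database, weight every match, then return the top results."""
    hits = [(score(r, term), r) for r in RECORDS if score(r, term) > 0]
    hits.sort(key=lambda pair: pair[0], reverse=True)
    return [r for _, r in hits[:max_results]]
```

With a small `max_results`, `search_buggy` returns whichever matches happen to sit earliest in the file, while `search_fixed` ranks all matching records before truncating, which is the behavior the new developers introduced.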
^ ab"List of PostScript files for the WWW94 advance proceedings". First International Conference on the World-Wide Web. June 1994. Archived from the original on 2018-05-08. Retrieved 2007-06-03. Title: "Aliweb - Archie-Like Indexing in the Web." Author: Martijn Koster. Institute: NEXOR Ltd., UK. PostScript, Size: 213616, Printed: 10 pages
^ Martijn Koster. "Robots in the Web: threat or treat?". Reprinted with permission from ConneXions: The Interoperability Report, Volume 9, No. 4, April 1995. Archived from the original on 2007-01-02. Retrieved 2007-01-03.