Popular web search engines rely on crawling to build an index of the World Wide Web. This continuous process of keeping the index fresh generates an enormous amount of traffic. Yet by far the largest part of the web remains unindexed, because crawlers are unaware of the existence of many web pages and have difficulty crawling dynamically generated content. Now assume that local web sites can instead be indexed by peers. Each peer cooperates with a broker by sending it part of its index. By receiving indices from many peers, the broker gains a global overview of the peers' content. When a user poses a query to the broker, the broker selects a few peers and forwards the query to them. The selected peers should be those likely to produce a good result set with many relevant documents. The individual result sets are merged at the broker and returned to the user. A research prototype, Sophos, was developed for this scenario; it combines highly discriminative keys, query-driven indexing, and a PageRank-like algorithm for selecting promising peers. Sophos is compared against a language-modeling baseline system.
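The broker's role in this scenario can be illustrated with a small sketch. The Python code below is only a toy model with assumed names (Peer, Broker, register_index, select_peers): peers publish part of their local index to the broker, and the broker picks the top-k peers for a query by simple term overlap and merges their result lists. The actual Sophos mechanisms, highly discriminative keys, query-driven indexing, and PageRank-like peer ranking, are not reproduced here.

```python
from collections import defaultdict


class Peer:
    """Toy peer holding a local index: term -> list of (doc_id, score)."""

    def __init__(self, local_index):
        self.local_index = local_index

    def partial_index(self):
        # Share only per-term document frequencies with the broker,
        # not the full postings lists.
        return {term: len(postings) for term, postings in self.local_index.items()}

    def search(self, query_terms):
        hits = []
        for term in query_terms:
            hits.extend(self.local_index.get(term, []))
        return hits


class Broker:
    """Toy broker: aggregates partial peer indices and routes queries."""

    def __init__(self, top_k=3):
        self.top_k = top_k
        # term -> {peer_id: document frequency reported by that peer}
        self.term_index = defaultdict(dict)

    def register_index(self, peer_id, partial_index):
        """Store the (term -> document frequency) entries a peer shared."""
        for term, df in partial_index.items():
            self.term_index[term][peer_id] = df

    def select_peers(self, query_terms):
        """Rank peers by how many query terms (weighted by df) they cover.

        This stands in for Sophos' peer-selection algorithm, which is not
        implemented here.
        """
        scores = defaultdict(int)
        for term in query_terms:
            for peer_id, df in self.term_index.get(term, {}).items():
                scores[peer_id] += df
        return sorted(scores, key=scores.get, reverse=True)[: self.top_k]

    def search(self, query_terms, peers):
        """Forward the query to selected peers and merge their result sets."""
        merged = []
        for peer_id in self.select_peers(query_terms):
            merged.extend(peers[peer_id].search(query_terms))
        # Merge by local score; a real broker would normalize scores
        # across peers before merging.
        return sorted(merged, key=lambda hit: hit[1], reverse=True)
```

As a usage example, each peer would call register_index with the output of its partial_index method, after which the broker answers queries by calling search with the query terms and a mapping from peer identifiers to peer objects.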