Re: problems with large directories - not only ext2

Matthew Wilcox (Matthew.Wilcox@genedata.com)
Thu, 12 Aug 1999 17:26:58 +0200


On Thu, Aug 12, 1999 at 09:53:03AM -0400, Robert G. Brown wrote:
> This is one
> of several reasons that the tendency of modern Unices (and linux
> distributions) to pack "everything" into a single directory (e.g.
> /usr/info) worries me. With a few hundred files, performance is ok, but
> if that ever scales up to a few thousand (as it might if everything is
> truly fully documented and dumped in that one directory) then
> performance is going to start to lag a bit.
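
[Editorial aside, not part of Robert's message: the concern above stems from
the fact that an ext2 directory of that era is searched linearly, so a name
lookup costs time proportional to the number of entries. A toy sketch of that
kind of lookup, with made-up file names, follows.]

```python
# Toy model (illustrative only) of a linear directory scan like ext2's of the
# time: each name lookup walks the entry list, so cost grows with entry count.
def lookup(dir_entries, name):
    """Return the matching entry, scanning entries one by one."""
    for entry in dir_entries:
        if entry == name:
            return entry
    return None

# With a few hundred entries this is cheap; with thousands, every open()
# of an info file pays for a scan over the whole list.
flat_dir = ["file%04d.info.gz" % i for i in range(5000)]
print(lookup(flat_dir, "file4999.info.gz"))
```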

All current versions of info can read their files from a per-package
subdirectory: instead of looking at /usr/info/gcc.info.gz, info looks at
files under /usr/info/gcc/. makeinfo (as distributed) cannot generate
info files laid out this way. The changes I made were pretty minor, but
when the new version came out I didn't have sufficiently recent versions
of the tools to recompile makeinfo and check that my changes still
worked, and then I moved and forgot about it. Someone sufficiently
motivated could redo it; it's not much work.
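
[Editorial sketch, not from the original message: a minimal illustration of
the lookup order being described, preferring a per-package subdirectory such
as /usr/info/gcc/ over a flat /usr/info/gcc.info.gz so the top-level directory
stays small. The exact file names tried and the INFO_DIR path are assumptions
made for illustration, not info's actual search rules.]

```python
import os

INFO_DIR = "/usr/info"  # assumed install prefix for this sketch

def find_info_file(topic):
    """Return a path to the top-level info file for `topic`, or None."""
    # New-style layout: /usr/info/<topic>/<topic>.info[.gz]
    subdir = os.path.join(INFO_DIR, topic)
    if os.path.isdir(subdir):
        for name in ("%s.info" % topic, "%s.info.gz" % topic):
            candidate = os.path.join(subdir, name)
            if os.path.exists(candidate):
                return candidate
    # Old-style flat layout: /usr/info/<topic>.info[.gz]
    for name in ("%s.info" % topic, "%s.info.gz" % topic):
        candidate = os.path.join(INFO_DIR, name)
        if os.path.exists(candidate):
            return candidate
    return None

print(find_info_file("gcc"))
```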

-- 
Matthew Wilcox <willy@bofh.ai>
"Windows and MacOS are products, contrived by engineers in the service of
specific companies. Unix, by contrast, is not so much a product as it is a
painstakingly compiled oral history of the hacker subculture." - N Stephenson

-
To unsubscribe from this list: send the line "unsubscribe linux-kernel" in
the body of a message to majordomo@vger.rutgers.edu
Please read the FAQ at http://www.tux.org/lkml/