
One Billion Dollars! Wait… I Mean One Billion Files!!!

The world is awash in data. This fact is putting more and more pressure on file systems to efficiently scale to handle increasingly large amounts of data. Recently, Ric Wheeler from Red Hat experimented with putting 1 billion files in a single file system to understand what problems and issues the Linux community might face in the future. Let’s see what happened…

As Ric pointed out in a presentation he gave at LinuxCon 2010, 1 billion files is very conceivable from a capacity perspective. If you use 1KB files, then 1 billion files (1,000,000,000) takes up only 1TB. If you use 10KB files, then you need 10TB to accommodate 1 billion files (not too difficult to imagine even in a home system). If you use 100KB files, then you need 100TB to hold 1 billion files. Again, it’s not hard to imagine 100TB in a storage array. The point is that with smaller files, current storage devices can easily accommodate 1 billion files from a capacity perspective.
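To make that arithmetic concrete, here is a quick back-of-the-envelope calculation in Python. It is a hypothetical illustration only, using decimal units (1KB = 1,000 bytes) and ignoring file system metadata overhead; it is not part of Ric's test setup:

    # How much raw capacity do one billion files consume at a
    # given average file size? (Decimal units, no metadata overhead.)
    ONE_BILLION = 1_000_000_000

    for size_kb in (1, 10, 100):
        total_bytes = ONE_BILLION * size_kb * 1000  # 1 KB = 1,000 bytes here
        print(f"{size_kb:>3} KB files -> {total_bytes / 1e12:g} TB")

Running this prints 1TB, 10TB, and 100TB, matching the figures above.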

Ric built a 100TB storage array (raw capacity) for his tests. But, as previously mentioned, you don’t need much capacity to perform these experiments. According to Ric, the life cycle of a file system has several stages.
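One of those stages is simply filling the file system with files. As a hypothetical illustration of that step (not Ric's actual test harness), the Python sketch below creates a large number of small files, fanned out across subdirectories so that no single directory grows unmanageably large:

    import os

    def fill(root, num_files, file_size=1024, files_per_dir=10_000):
        """Create num_files small files under root, spread across subdirectories."""
        payload = b"\0" * file_size
        for i in range(num_files):
            # Start a new subdirectory every files_per_dir files so
            # directory listings stay a manageable size.
            subdir = os.path.join(root, f"dir{i // files_per_dir:06d}")
            if i % files_per_dir == 0:
                os.makedirs(subdir, exist_ok=True)
            with open(os.path.join(subdir, f"file{i:09d}"), "wb") as f:
                f.write(payload)

    # Start small and scale num_files up as the hardware allows.
    fill("/tmp/fill-test", num_files=100_000)

A run like this at the 100,000-file scale is harmless on most machines; pushing toward 1 billion files is exactly where the scaling questions Ric set out to study begin to appear.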
