This video is part of the appearance, “X-IO Technologies Presents at Storage Field Day 13”. It was recorded as part of Storage Field Day 13 at 10:30-12:30 on June 16, 2017.
Watch on YouTube
Watch on Vimeo
In his presentation at Storage Field Day 13, Richard Lary, Chief Scientist at X-IO Technologies, delves into the complexities and innovations in deduplication technology, emphasizing the significant role of mathematics in enhancing performance. Lary begins by highlighting the resource-intensive nature of deduplication, which involves substantial memory, CPU cycles, and disk accesses. He explains that deduplication typically relies on computationally intensive signature techniques to compare incoming data with existing data, necessitating a robust and persistent database of signatures. This database must handle petascale systems with billions of entries, survive power and controller failures, and maintain high write throughput despite the challenges posed by the random nature of hash-based signatures. Lary critiques the inefficiency of traditional caching methods in this context and underscores the need for a high-throughput, persistent mapping database to manage the dynamic nature of deduplication.
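The signature-based approach Lary describes can be illustrated with a minimal sketch: hash each incoming block, look the signature up in the mapping database, and store new data only when the signature is unseen. The class, block size, and in-memory dictionary below are illustrative assumptions (a real system needs the persistent, failure-tolerant database Lary emphasizes); SHA-256 stands in for whatever signature function is used.

```python
import hashlib

BLOCK_SIZE = 4096  # assumed block size for illustration


class DedupStore:
    """Toy signature-based deduplicating store (sketch, not X-IO's design)."""

    def __init__(self):
        self.signatures = {}  # signature -> block id; stands in for the persistent database
        self.blocks = []      # physical block storage

    def write_block(self, data: bytes) -> int:
        sig = hashlib.sha256(data).digest()  # compute the block's signature
        block_id = self.signatures.get(sig)  # probe the signature database
        if block_id is None:                 # unique data: store it and record the signature
            block_id = len(self.blocks)
            self.blocks.append(data)
            self.signatures[sig] = block_id
        return block_id                      # duplicate writes map to the same block id


store = DedupStore()
a = store.write_block(b"x" * BLOCK_SIZE)
b = store.write_block(b"x" * BLOCK_SIZE)  # duplicate of the first write
c = store.write_block(b"y" * BLOCK_SIZE)  # unique data
```

Because signatures are effectively random, these database probes scatter across the whole index, which is why ordinary caching works poorly at billions of entries.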
Lary then introduces the concept of using non-crypto signatures, specifically MetroHash, to improve deduplication performance. He explains that while traditional crypto hashes like SHA-1 and SHA-256 are secure, they are slow and CPU-bound, making them less suitable for high-performance storage systems. In contrast, non-crypto hashes like MetroHash are significantly faster and can handle the high throughput demands of modern storage systems. Lary also discusses the innovative use of a “bouquet filter,” a variation of the Bloom filter, to efficiently manage the deduplication process. This approach uses many small Bloom filters in place of one large one to reduce computational overhead and improve performance. Lary hints at a proprietary method developed by X-IO that further optimizes deduplication by treating unique data differently, reducing wasted resources and enhancing overall system efficiency. This method, for which a patent is pending, promises to significantly improve deduplication performance while maintaining data integrity.
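The building block behind the bouquet filter is the ordinary Bloom filter: a bit array that answers "definitely not seen" or "possibly seen" without storing the signatures themselves. The sketch below is a generic Bloom filter, not X-IO's patented bouquet design; the bit-array size, hash count, and double-hashing index derivation are all assumptions for illustration.

```python
import hashlib


class BloomFilter:
    """Plain Bloom filter; a 'bouquet filter' as Lary describes it would
    spread membership tests across many small filters like this one."""

    def __init__(self, m_bits: int, k_hashes: int):
        self.m = m_bits
        self.k = k_hashes
        self.bits = bytearray((m_bits + 7) // 8)

    def _indices(self, item: bytes):
        # Derive k bit positions from one digest via the double-hashing idiom.
        d = hashlib.sha256(item).digest()
        h1 = int.from_bytes(d[:8], "little")
        h2 = int.from_bytes(d[8:16], "little") | 1  # force odd for better dispersion
        return [(h1 + i * h2) % self.m for i in range(self.k)]

    def add(self, item: bytes):
        for i in self._indices(item):
            self.bits[i // 8] |= 1 << (i % 8)

    def might_contain(self, item: bytes) -> bool:
        # No false negatives; false positives possible at a tunable rate.
        return all(self.bits[i // 8] & (1 << (i % 8)) for i in self._indices(item))


bf = BloomFilter(m_bits=1024, k_hashes=4)
bf.add(b"signature-1")
```

A negative answer lets the system skip the expensive database lookup entirely for unique data, which is where the resource savings Lary alludes to come from.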
Personnel: Richard Lary