Santa Clara and San Francisco, CA, April 12, 2022
Powered by deep learning, transformer models deliver state-of-the-art performance on a wide range of machine learning tasks, such as natural language processing, computer vision, speech, and more. However, training them at scale often requires a significant amount of computing power, making the whole process unnecessarily long, complex, and costly.
Today, Habana® Labs, a pioneer in high-efficiency, purpose-built deep learning processors, and Hugging Face, the home of Transformer models, are happy to announce that they are joining forces to make it easier and faster to train high-quality transformer models. Thanks to the integration of Habana’s SynapseAI software suite with the Hugging Face Optimum open-source library, data scientists and machine learning engineers can now accelerate their Transformer training jobs on Habana processors with just a few lines of code and enjoy greater productivity as well as lower training cost.
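As a rough illustration of what "a few lines of code" can look like, here is a minimal sketch of fine-tuning a model through the Optimum library's Habana integration. It assumes the `optimum-habana` package and access to a Gaudi-backed machine (such as an Amazon EC2 DL1 instance), so it only runs on that hardware; the model name and Gaudi configuration shown are illustrative placeholders.

```python
# Sketch: fine-tuning a Hugging Face Transformer on Habana Gaudi via Optimum.
# Requires the `optimum-habana` package and a Gaudi-enabled instance;
# `train_dataset` stands in for your own tokenized dataset (not shown here).
from transformers import AutoModelForSequenceClassification
from optimum.habana import GaudiTrainer, GaudiTrainingArguments

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased")

# GaudiTrainingArguments and GaudiTrainer are drop-in replacements for the
# standard TrainingArguments and Trainer classes from Transformers.
args = GaudiTrainingArguments(
    output_dir="./results",
    use_habana=True,       # run training on Habana Gaudi processors
    use_lazy_mode=True,    # enable lazy-mode graph execution on Gaudi
    gaudi_config_name="Habana/bert-base-uncased",  # Gaudi config from the Hub
)

trainer = GaudiTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
)
trainer.train()
```

Compared with a standard Transformers training script, only the two imported classes and the Gaudi-specific arguments change, which is what keeps the migration down to a few lines.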
Habana Gaudi training solutions, which power Amazon’s EC2 DL1 instances and Supermicro’s X12 Gaudi AI Training Server, deliver up to 40% better price/performance than comparable training solutions and enable customers to train more while spending less. The integration of ten 100 Gigabit Ethernet ports onto every Gaudi processor enables system scaling from one to thousands of Gaudis with ease and cost-efficiency. Habana’s SynapseAI®, optimized from inception to enable Gaudi performance and usability, supports the TensorFlow and PyTorch frameworks, with a focus on computer vision and natural language processing applications.
With 60,000+ stars on GitHub, 30,000+ models, and hundreds of thousands of monthly visits, Hugging Face is one of the fastest-growing projects in open-source software history, and the go-to place for the machine learning community.
With its Hardware Partner Program, Hugging Face provides Gaudi’s advanced deep learning hardware with the ultimate Transformer toolset. This partnership will enable rapid expansion of the Habana Gaudi training transformer model library, bringing Gaudi efficiency and ease of use to a wide range of customer use cases such as natural language processing, computer vision, speech, and more.
“We’re excited to partner with Hugging Face and its many open-source developers to address the growing demand for transformer models that benefit from the efficiency, usability, and scalability of the Gaudi training platform,” said Sree Ganesan, head of software product management, Habana Labs.
“Habana Gaudi brings a new level of efficiency to deep learning model training, and we’re super excited to make this performance easily accessible to Transformer users with minimal code changes through Optimum,” said Jeff Boudier, product director at Hugging Face.
To learn how to get started training with Habana Gaudi, please visit https://developer.habana.ai.
For more information on the Hugging Face and Habana Gaudi collaboration, please visit https://huggingface.co/Habana.
