Intel’s AI hardware accelerators are better than any GPU currently available on the market

Since the rise of ChatGPT, AI has moved to the forefront as more and more users adopt it for mainstream tasks. Such AI workloads demand high-performance computing: while ChatGPT relies extensively on NVIDIA's GPUs, Intel has stepped up to meet these high-end computing demands.

Intel’s democratization of AI and support for an open ecosystem will meet the computing needs for generative AI

As generative AI models get bigger, power efficiency becomes a critical factor in driving productivity across a wide range of complex AI workloads, from data pre-processing to training and inference. Developers need a build-once-and-deploy-everywhere approach with flexible, open, energy-efficient, and more sustainable solutions that allow all forms of AI, including generative AI, to reach their full potential.

[Figure: Automatic evaluation of generated language output by BLOOMZ models (up to 176B parameters) on 100K LMentry prompts, using Habana Gaudi accelerators]

Intel is taking steps to make itself the obvious choice for enabling generative AI, optimizing popular open-source frameworks, libraries, and tools to extract the best hardware performance while removing complexity. Intel's AI hardware accelerators, together with the accelerators built into 4th Gen Intel® Xeon® Scalable processors, deliver performance and performance-per-watt gains that address the performance, price, and sustainability needs of generative AI.

Recently, Hugging Face, the leading open-source machine learning platform, published results showing that inference runs faster on Intel's AI hardware accelerators than on any GPU currently available on the market, with Habana® Gaudi®2 running inference on a 176-billion-parameter model 20% faster than NVIDIA's A100.


In addition, Intel has demonstrated power efficiency: running a popular computer vision workload on a Gaudi2 server showed a 1.8x advantage in throughput-per-watt over a comparable A100 server. So, as the AI market heats up, Intel, a silicon giant for decades, is ready to challenge NVIDIA by offering its computational bandwidth to customers looking to push AI even further.
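As a rough sketch of what these two ratios mean, the calculation below shows how a 20% inference speedup and a 1.8x throughput-per-watt advantage are computed. The absolute latency, throughput, and power figures here are hypothetical, chosen only to reproduce the ratios the article reports:

```python
# Hypothetical illustration of the two ratios quoted above.
# The absolute numbers are made up; the article gives only the ratios.

def speedup_pct(baseline_latency_s: float, new_latency_s: float) -> float:
    """Percent reduction in per-batch inference latency."""
    return (baseline_latency_s - new_latency_s) / baseline_latency_s * 100

def throughput_per_watt(items_per_s: float, watts: float) -> float:
    """Efficiency metric: work completed per watt of server power draw."""
    return items_per_s / watts

# Latency: a 20% speedup means e.g. a 10.0 s batch now takes 8.0 s.
print(f"speedup: {speedup_pct(10.0, 8.0):.0f}%")        # -> speedup: 20%

# Efficiency: Gaudi2 vs. A100 server (hypothetical figures).
gaudi2 = throughput_per_watt(4500, 1000)  # 4.5 items/s per watt
a100 = throughput_per_watt(2500, 1000)    # 2.5 items/s per watt
print(f"advantage: {gaudi2 / a100:.1f}x")  # -> advantage: 1.8x
```

The same arithmetic applies to any pair of accelerators; only the measured throughput and wall-power numbers change.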

Read more about this announcement here.



Raunak Saha
A CS engineer by profession but a foodie at heart. I am a tech lover with a passion for singing. Football is my love, and making websites is my hobby.
