Intel bets big on AI ecosystem in India


In line with the global launch of the second-generation Xeon processor in San Francisco, chip maker Intel has introduced one of the widest ranges of innovations in its data-centric portfolio in the company's history. The Xeon legacy spans decades at Intel; now, after a gap of four years, the company has launched its 2nd-generation Intel Xeon Scalable processor. It claims significant improvements over the first-generation processor, along with a slew of computing, memory and networking products.

Speaking on the sidelines of the product launch in Bengaluru, Prakash Mallya, Vice President and Managing Director, Sales and Marketing Group, Intel India, explained how the new offerings will boost computing performance and pave the way for emerging technologies like AI, ML, 5G and cloud.

Intel has been betting big on the growth of AI in India. The company has trained as many as 99,000 developers, students and professors in artificial intelligence (AI) in the country since April 2017, against a target of 15,000 for the first year of its programme. It has also tied up for training with premier educational institutes like the IITs in Delhi, Mumbai, Kharagpur, Kanpur and Chennai, the IIITs in Bengaluru and Hyderabad, BITS Pilani, ISI Kolkata, IISc Bangalore and CDAC, as well as with companies like Shell and TCS, among others.

“At Intel, our endeavor is to extend the benefits of AI to large as well as small companies. Another driving force is the rollout of 5G. It will create an impact on the intelligent edge and will force operators to build networks that are agile, scalable, and offer more computing power and performance. The intelligent edge is critical for 5G,” informs Mallya.

The biggest enhancement from the first generation to the second is the optimisation of specific workloads for the platform. For instance, with deep learning, the second-generation Xeon boosts performance 14x. This caters to the growing need for AI, since AI is the fastest-growing workload in the data centre. In network functions, networks are being transformed from proprietary hardware to open source software based on x86.
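The inference speed-up mentioned above comes largely from running neural-network arithmetic in low-precision integers rather than 32-bit floats, which is what Intel DL Boost (VNNI instructions) accelerates in hardware. A minimal pure-Python sketch of the idea, with a simplified symmetric quantisation scheme chosen for illustration (the scale values and vectors below are made-up, not Intel's implementation):

```python
# Sketch of INT8 quantised inference, the kind of arithmetic DL Boost
# accelerates: floats are mapped to 8-bit integers, the dot product is
# accumulated in integer arithmetic, and the result is rescaled to float.

def quantize(values, scale):
    """Map float values to the int8 range [-128, 127] using a fixed scale."""
    return [max(-128, min(127, round(v / scale))) for v in values]

def int8_dot(a, b, scale_a, scale_b):
    """Dot product performed in int8 arithmetic, rescaled back to float."""
    qa, qb = quantize(a, scale_a), quantize(b, scale_b)
    acc = sum(x * y for x, y in zip(qa, qb))  # integer accumulate (VNNI-style)
    return acc * scale_a * scale_b

# Made-up weights and inputs for illustration only.
weights = [0.5, -1.2, 0.75, 2.0]
inputs = [1.0, 0.25, -0.5, 1.5]

exact = sum(w * x for w, x in zip(weights, inputs))
approx = int8_dot(weights, inputs, scale_a=2.0 / 127, scale_b=1.5 / 127)
# approx closely tracks exact: a small precision loss buys much faster
# integer arithmetic, which is the trade-off INT8 inference makes.
```

The integer accumulation step is the hot loop in convolution and matrix-multiply kernels; doing it in 8-bit lanes instead of 32-bit floats is where the hardware speed-up comes from.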

Through optimisation, Intel has been able to make specific workloads run faster. Storage innovations in the second-generation Xeon platform, along with Intel Optane DC persistent memory, are game-changing for big data analytics and real-time in-memory computing. Such use cases require high-capacity, high-performance memory, which until now was not economically feasible to deploy; the Optane DC memory is estimated to cost around 50 per cent of the price of DRAM. Going by the adoption of the first-generation Xeon, Intel is confident of seeing more appetite from partners and customers in India. Market needs will push things forward, as customers and partners want to disrupt themselves as well as the industry.

AI is not an application in itself; it will be embedded in applications. As AI inference gets embedded in the application, Intel DL Boost technology accelerates it, improving performance significantly and making it flexible to deploy as well. “We will see deployment become faster, and our partners and customers can pursue more AI-driven opportunities. Overall, I see only two per cent of the data created being analysed, as analysis requires more compute performance. Now, AI inference workloads like image recognition, object detection and image segmentation run within data centre, enterprise and intelligent-edge computing environments. Intel has worked extensively with ecosystem partners to optimise frameworks and applications to take full advantage of Intel DL Boost technology. Customers can choose enhanced tools like OpenVINO to ease deployment,” says Mallya.
