Intel held Intel Labs Day 2020 as a virtual conference on December 3. Themed 'In Pursuit of 1000X: Disruptive Research for the Next Decade in Computing', the event was headed by Rich Uhlig, Intel senior fellow, vice president and director of Intel Labs, and joined by other domain experts.
Staying true to its theme, Intel shone the spotlight on several research initiatives where its researchers are leading efforts to address critical challenges in the future of computing. The event focused on photonics, quantum computing, confidential computing, machine programming, and neuromorphic computing, among others. It also brought a few key announcements and updates about new and emerging technologies under development at Intel. In this article, we have listed these key announcements:
Integrated Photonics for Data Centers
At the event, Intel showcased the integration of photonics with low-cost, high-volume silicon, a 'long-standing vision'. With this integration, Intel hopes to advance optical interconnects and address the problem of data workloads overwhelming network traffic in data centres.
Growing data-centric workloads and increased data movement between servers are overwhelming the network infrastructure within data centres. The industry is believed to be rapidly approaching the limits of electrical input/output, which, combined with the growth in bandwidth demand for computing, has resulted in an I/O power wall. This power wall limits the power available for other compute processes.
One possible solution is to bring optical I/O directly into servers, connecting future data centres and networks and overcoming the limits of electrical I/O by integrating optical and silicon technologies.
Introducing the research, James Jaussi, senior principal engineer and director of the PHY Lab at Intel Labs, said, "Our research on tightly integrating photonics with CMOS silicon can systematically eliminate barriers across cost, power and size constraints to bring the transformative power of optical interconnects to server packages."
Neuromorphic Ecosystem Growth
Mike Davies, director of Intel's Neuromorphic Computing Lab, presented an update on the progress made by the Intel Neuromorphic Research Community (INRC), which consists of 100 academic groups, government labs, and research institutes. Founded in 2018, the community works to develop neuromorphic computing: circuits that mimic the human nervous system and serve as building blocks for supercomputers with up to 1,000 times more power.
Apart from announcing new industry partnerships with Lenovo and Logitech, among others, Intel also shared benchmarking updates:
- Accenture's testing found that Intel's flagship neuromorphic computing hardware, the Loihi chip, outperformed a standard GPU at voice command recognition.
- A joint exercise between Accenture and INRC demonstrated that Loihi could learn new gestures in just a few exposures.
- Loihi also demonstrated superior performance in image retrieval.
- This neuromorphic chip from Intel can solve optimisation and search problems 1,000 times more efficiently and 100 times faster than traditional CPUs.
Horse Ridge II for Quantum Computing
During the event, Intel introduced Horse Ridge II, its second-generation cryogenic control chip. According to Intel, the chip will support "enhanced capabilities and higher levels of integration for elegant control of the quantum system". Its most important features include reading and manipulating qubit (quantum bit) states and addressing scalability, one of quantum computing's biggest challenges.
ControlFlag for Machine Programming
Intel unveiled ControlFlag, a machine programming research system that can autonomously detect errors in code. As per the chipmaker, this self-supervised system is a 'powerful productivity tool' that will assist software developers with the labour-intensive task of debugging. In preliminary tests, ControlFlag trained on over a billion unlabelled lines of production-quality code and learned to identify defects.
Its bug detection capabilities are supported by machine learning, programming languages, compilers, and computer systems. ControlFlag learns normal coding patterns from verified examples and flags any deviation or anomaly that may indicate a bug. One of its most impressive features is its unsupervised approach to pattern recognition, which enables it to intrinsically learn and adapt to a developer's style.
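To make the idea concrete, here is a deliberately simplified sketch of pattern-based anomaly detection on source code. This is not Intel's ControlFlag implementation (which works at far larger scale with learned representations); it is a toy illustration of the same principle: mine common token patterns from known-good snippets, then flag patterns in new code that were never seen during training, such as the classic `=` vs `==` typo in a condition.

```python
from collections import Counter
import re

def tokenize(code: str):
    # Split source into identifiers, multi-character operators, and symbols.
    return re.findall(r"[A-Za-z_]\w*|==|!=|<=|>=|&&|\|\||\S", code)

def learn_patterns(corpus):
    """Count adjacent token pairs across known-good snippets."""
    counts = Counter()
    for snippet in corpus:
        toks = tokenize(snippet)
        counts.update(zip(toks, toks[1:]))
    return counts

def flag_anomalies(snippet, counts):
    """Return token pairs in `snippet` never seen in the training corpus."""
    toks = tokenize(snippet)
    return [pair for pair in zip(toks, toks[1:]) if counts[pair] == 0]

# Train on "verified" C-style conditions that use == for comparison.
corpus = [
    "if (ptr == NULL) return;",
    "if (count == 0) return;",
    "if (x == y) return;",
]
patterns = learn_patterns(corpus)

# The assignment-in-condition typo produces unseen pairs and gets flagged.
print(flag_anomalies("if (ptr = NULL) return;", patterns))
# → [('ptr', '='), ('=', 'NULL')]
```

A real system would use abstract syntax trees and statistical thresholds rather than exact adjacent-pair matching, but the core intuition carries over: code that deviates from patterns the model has seen billions of times is worth a second look.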