
World’s First AI Analog Chip


Austin-based Mythic has launched the Mythic Analog Matrix Processor (Mythic AMP), a single-chip analog computation device. The M1076 AMP uses the Mythic Analog Compute Engine (ACE) to deliver the compute resources of a GPU at up to a tenth of the power consumption.

Drawing just 3 watts, the M1076 can perform up to 25 trillion operations per second (25 TOPS). The new lineup includes a standalone chip, a PCIe M.2 card for low-footprint applications, and a PCIe card carrying up to 16 chips. Because computation happens in the same place the data is stored, edge devices can now execute complex AI applications at higher resolutions and frame rates, yielding superior inference results.

Last month, Mythic raised $70 million in Series C funding, in a round led by BlackRock and co-led by Hewlett Packard Enterprise (HPE).

Why Mythic AMP?

In a traditional computer, the processor and memory are separate: memory holds programs and data, and data moves back and forth between the two, with the CPU fetching from DRAM at regular intervals. Over the years, processor speeds have risen drastically, while memory advances have focused mainly on density (storing more data in less space) rather than on transfer rates, leading to latency.

Simply put, no matter how fast a processor is, it must sit idle while fetching data from memory, limited by the transfer rate. This is the von Neumann bottleneck. By merging compute and memory in a single device, analog AI eliminates that bottleneck, resulting in dramatic performance gains. With no data transit, tasks complete in a fraction of the time and with far less energy.
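A little back-of-the-envelope arithmetic makes the bottleneck concrete. The figures below are illustrative assumptions, not Mythic's specifications: a processor with ample peak compute, starved by a memory bus that is three orders of magnitude slower.

```python
# Hypothetical numbers to illustrate the von Neumann bottleneck: a fast
# processor spends most of its time waiting on the memory bus.

def transfer_time_s(n_bytes: float, bandwidth_bytes_per_s: float) -> float:
    """Time to move n_bytes over the memory bus."""
    return n_bytes / bandwidth_bytes_per_s

def compute_time_s(n_ops: float, ops_per_s: float) -> float:
    """Time to execute n_ops on the processor."""
    return n_ops / ops_per_s

# Assumed figures: 25 TOPS of peak compute, 25 GB/s of memory bandwidth,
# and a workload that fetches 1 byte per operation.
ops = 25e12               # one second's worth of peak compute
data_bytes = ops          # 1 byte fetched per op
t_compute = compute_time_s(ops, 25e12)        # 1.0 s of actual work
t_memory = transfer_time_s(data_bytes, 25e9)  # 1000.0 s of waiting
print(t_compute, t_memory)  # the processor idles ~99.9% of the time
```

Under these assumptions the workload is memory-bound by a factor of a thousand, which is exactly the gap that collapses when compute happens where the data already sits.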

Each Mythic ACE is paired with a digital subsystem comprising a 32-bit RISC-V nano processor, 64 KB of SRAM, a SIMD vector engine, and a high-throughput network-on-chip (NoC) router. The Analog Matrix Processor delivers power-efficient AI inference at up to 25 TOPS. "Edge devices can now deploy powerful AI models without the challenges of high power consumption, thermal management, and form-factor constraints," the company said.
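The core idea of analog compute-in-memory is that a flash array performs matrix-vector multiplication in place: weights are stored as cell conductances, inputs arrive as voltages, and Ohm's law plus current summation yield the dot product, which an ADC digitizes. The sketch below is a minimal digital simulation of that idea; the function name, noise figure, and ADC resolution are illustrative assumptions, not Mythic's actual design.

```python
import numpy as np

def analog_mvm(weights, x, adc_bits=8, noise_std=0.01, rng=None):
    """Simulate an analog in-memory matrix-vector multiply.

    weights : conductance matrix stored in the flash array
    x       : input voltages applied to the array
    Cell currents I = G * V sum along each output line (Kirchhoff's
    current law); an ADC then quantizes the result back to digital.
    """
    rng = rng or np.random.default_rng(0)
    ideal = weights @ x                                     # current summation
    noisy = ideal + rng.normal(0, noise_std, ideal.shape)   # analog noise
    # Quantize to 2**adc_bits levels over the observed output range.
    lo, hi = noisy.min(), noisy.max()
    levels = 2 ** adc_bits - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    return np.round((noisy - lo) / scale) * scale + lo

W = np.array([[0.2, 0.5], [0.1, 0.9]])
x = np.array([1.0, -1.0])
print(analog_mvm(W, x))  # close to W @ x = [-0.3, -0.8], within noise/ADC error
```

The trade-off this models is the one analog AI makes in general: the multiply-accumulate is essentially free of data movement, at the cost of noise and quantization error that the digital subsystem and model training must tolerate.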

AI on edge

Mythic’s primary focus is edge AI deployments, though the company also provides server-class compute in data centres. Edge AI lets businesses deploy ML models that run locally on edge devices. However, it faces some challenges:

  • Low power: A device’s power draw, and the heat that comes with it, grows as more functions and capabilities are added. Some devices are powered over Ethernet (PoE) with tight power budgets, so they must perform well at 0.5 to 2 W, draw near-zero power when idle, and switch between these modes quickly and simply.
  • Small size: Running AI algorithms at the data source minimises latency and avoids the accuracy loss that comes with video compression, removing the need for large PCIe cards, big heatsinks, or fans. In some designs, the entire system must fit on a 22 mm x 30 mm M.2 A+E card; even with larger PCIe cards, the size of the accelerator and its cooling solution determines how much AI can be crammed in.
  • Cost-effectiveness: Delivering high-powered computing at an affordable price gives customers the freedom to scale with demand.

To date, the company has raised $165.2 million to make powerful AI easy and cost-effective to deploy for smart homes, smart cities, AR/VR, drones, video surveillance, and even manufacturing.

PS: The story was written using a keyboard.

Kumar Gandharv

Kumar Gandharv, PGD in English Journalism (IIMC, Delhi), is setting out on a journey as a tech journalist at AIM. A keen observer of national and IR-related news.