Austin-based Mythic has launched the Mythic Analog Matrix Processor (Mythic AMP), a single-chip analog computation device. The M1076 AMP uses the Mythic Analog Compute Engine (ACE) to deliver GPU-class compute resources at as little as a tenth of the power consumption.
With a 3-watt power draw, the M1076 can perform up to 25 trillion operations per second (TOPS). The new lineup includes the single chip, an M.2 card for low-footprint applications, and a PCIe card with up to 16 chips. Edge devices can now execute complex AI applications at higher resolutions and frame rates, improving inference results. Computation happens in the same place the data is stored.
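The figures above imply a power efficiency of roughly 8.3 TOPS per watt, a quick sanity check worth making explicit (the calculation below uses only the numbers quoted in the article):

```python
# Efficiency implied by the article's figures: 25 TOPS at a 3 W draw.
tops = 25.0   # trillion operations per second
watts = 3.0   # power draw

tops_per_watt = tops / watts
print(round(tops_per_watt, 1))  # prints 8.3
```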
Last month, Mythic raised $70 million in Series C funding, led by BlackRock and co-led by Hewlett Packard Enterprise (HPE).
Why Mythic AMP?
In traditional computers, the processor and memory are separate: memory holds programs and data, and data moves between the two at regular intervals. Over the years, processor speeds have seen a drastic uptick, while memory advancements have focused mainly on density (the ability to store more data in less space) rather than transfer rates, leading to latency.
Simply put, processors, no matter how fast, have to sit idle while fetching data from memory and are limited by the transfer rate. This is the von Neumann bottleneck. By merging compute and memory in a single device, analog AI eliminates that bottleneck, resulting in dramatic performance gains. Because there is no data transit, tasks can be completed in a fraction of the time and with far less energy.
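Conceptually, an analog compute-in-memory array stores the weights of a neural-network layer as cell conductances and performs a matrix-vector multiply in place: input voltages drive the cells, Ohm's law produces per-cell currents, and Kirchhoff's current law sums them on each output line. The sketch below models that idea numerically; the values and dimensions are illustrative, not Mythic's actual design:

```python
import numpy as np

rng = np.random.default_rng(0)

# Weights stay resident in the array as conductances G (no fetch from
# DRAM per operation); inputs arrive as voltages V.
G = rng.uniform(0.0, 1.0, size=(4, 8))   # 4 output lines, 8 input lines
V = rng.uniform(0.0, 1.0, size=8)        # input voltages

# Each output current is the analog sum of per-cell currents G[i, j] * V[j];
# in digital terms, the whole array computes one matrix-vector product.
I = G @ V

# An ADC would then digitize I for the next layer; here we just inspect it.
print(I.shape)  # prints (4,)
```

The point of the sketch is that the expensive step of a conventional accelerator, streaming the weight matrix from memory for every multiply, simply does not occur: the weights never leave the array.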
Each Mythic ACE is accompanied by a digital subsystem comprising a 32-bit RISC-V nano processor, 64KB of SRAM, a SIMD vector engine, and a high-throughput network-on-chip (NoC) router. The Analog Matrix Processor delivers power-efficient AI inference at up to 25 TOPS. "Edge devices can now deploy powerful AI models without the challenges of high power consumption, thermal management, and form-factor constraints," according to the company.
AI on edge
Mythic’s primary focus is on edge AI deployments, though the company also provides server-class compute in data centres. Edge AI lets businesses deploy ML models that run locally on edge devices. However, edge AI faces some challenges:
- Low Power: A device's power draw and the heat it generates grow as more functions and capabilities are added. Some devices are powered over Power-over-Ethernet (PoE), with limited power budgets. Devices need to perform strongly even at 0.5 to 2 W, power should be as near to zero as possible when idle, and switching between these modes should be rapid and simple.
- Small Size: AI algorithms running at the data source have minimal latency and lose no accuracy to video compression, so there is no need for large PCIe cards, big heatsinks, or fans. For some products, the entire system needs to fit on a 22mm x 30mm M.2 A+E card. Even with larger PCIe cards, the size of the accelerator and its cooling solution determines how much AI can be packed in.
- Cost-effectiveness: The ability to deliver high-performance computing at an affordable price gives customers the freedom to scale with demand.
To date, the company has raised $165.2 million for the easy and cost-effective deployment of powerful AI in smart home, smart city, AR/VR, drone, video surveillance, and even manufacturing applications.