
This episode is sponsored by Oracle. Try OCI for free at
OCI is the next-generation cloud designed for every workload – where you can run any application, including any AI projects, faster and more securely for less.
On average, OCI costs 50% less for compute, 70% less for storage, and 80% less for networking.
Join Modal, Skydance Animation, and today's innovative AI tech companies who upgraded to OCI…and saved.
Why is AI moving from the cloud to our devices, and what makes on-device intelligence finally practical at scale?
In this episode of Eye on AI, host Craig Smith speaks with Christopher Bergey, Executive Vice President of Arm's Edge AI Business Unit, about how edge AI is reshaping computing across smartphones, PCs, wearables, cars, and everyday devices.
We explore how Armv9 enables AI inference at the edge, why heterogeneous computing across CPUs, GPUs, and NPUs matters, and how developers can balance performance, power, memory, and latency.
Learn why memory bandwidth has become the biggest bottleneck for AI, how Arm approaches the Scalable Matrix Extension, and what trade-offs exist between dedicated accelerators and traditional CPU-based AI workloads.
You will also hear real-world examples of edge AI in action, from smart cameras and hearing aids to XR devices, robotics, and in-car systems.
The conversation looks ahead to a future where intelligence is embedded in everything you use, where AI becomes the default interface, and why reliable, low-latency, on-device AI is essential for creating experiences users actually trust.
Stay Updated:
Craig Smith on X:
Eye on A.I. on X:
In this interview, Chris Bergey, Senior Vice President of Arm's Client Business Unit, takes a deep look at how the Arm architecture has evolved from the cornerstone of the smartphone revolution into the core engine of today's edge AI computing. The conversation shows how Arm, through heterogeneous computing, power-efficiency optimization, and its developer ecosystem, is pushing AI from the cloud to the device and reshaping the future of human-computer interaction.
Closing thought: Arm is acting as an "invisible engine," moving edge AI from concept to everyday experience. The keys to its success are three decades of ecosystem building, a pragmatic philosophy of heterogeneous computing, and an accurate read on the trend toward device-level intelligence. For developers and enterprises, embracing the Arm ecosystem means not just technical compatibility but a ticket to help reshape the future of human-computer interaction.