The past few years have seen explosive growth in the use of artificial intelligence (AI) and machine learning (ML), and the market for AI-enabled products continues to grow exponentially. Although AI applications can run on a variety of computing platforms, including microprocessor units (MPUs) and graphics processing units (GPUs), tremendous gains in performance, accompanied by dramatic reductions in power consumption, can be achieved by employing specialized neural processing units (NPUs).
The problem with pairing traditional SMARC modules with standalone NPUs is that designers must also create carrier boards that can accommodate add-on cards carrying the NPU. In addition to increasing the size, cost, and power consumption of the resulting system, this approach negatively impacts development time, effort, and resources, lengthens time-to-market, and increases risk. The solution is to use a SMARC AIOM (AI-on-Module), which is a SMARC computer-on-module (COM) that carries both MPU and NPU devices.