Most of the investment buzz in AI hardware centers on the accelerator chips that crunch the math behind neural networks, like Nvidia’s GPUs. But what about the rest of the story? The CPUs and NICs that pre- and post-process each query are general-purpose devices that were never designed for AI, yet they can add tens of thousands of dollars to the cost of each server. What if someone re-imagined those servers with a clean-sheet design built to handle the AI task at hand efficiently?