Wayne Williams reports via TechRadar: Qualcomm announced AI Hub, a comprehensive library of pre-optimized AI models for devices running on Snapdragon and other Qualcomm platforms. These models support a wide range of applications, including natural language processing, computer vision, and anomaly detection, and are designed to deliver high performance with minimal power consumption, which is critical for mobile and edge devices. The AI Hub library currently contains more than 75 popular AI and generative AI models, such as Whisper, ControlNet, Stable Diffusion, and Baichuan 7B. All models are bundled with different runtimes and are optimized to take advantage of the Qualcomm AI Engine's hardware acceleration across all cores (NPU, CPU, GPU). Qualcomm says this makes inference up to four times faster.
AI Hub also automatically handles model conversion from source frameworks to popular runtimes, working directly with the Qualcomm AI Engine Direct SDK to apply hardware-aware optimizations. Developers can search for, download, and integrate models into their applications based on their needs, saving time and resources. AI Hub also provides tools and resources for customizing these models, which can be fine-tuned using the Qualcomm Neural Processing SDK and the AI Model Efficiency Toolkit, both available on the platform.