Following NVIDIA's Chat with RTX launch, AMD is now offering users their own localized, GPT-based LLM-powered AI chatbot which can run on Ryzen AI CPUs & Radeon RX 7000 GPUs.
Last month, NVIDIA launched its "Chat with RTX" AI chatbot, which is available across its RTX 40 & RTX 30 GPUs and is accelerated by the TensorRT-LLM feature set, delivering faster GenAI results based on the data you make available to it from your PC, in other words a localized dataset. Now AMD is offering its own LLM-based GPT chatbot which can run on a diverse range of hardware: Ryzen AI PCs, which include Ryzen 7000 & Ryzen 8000 APUs featuring XDNA NPUs, along with the latest Radeon RX 7000 GPUs which feature AI accelerator cores.
AMD has published a blog post with a setup guide on how to use its hardware to run your own localized chatbot powered by GPT-based LLMs (Large Language Models). For AMD Ryzen AI CPUs, you can grab the standard LM Studio build for Windows, while Radeon RX 7000 GPUs use the ROCm technical preview build. The full guide is shared below:
Having a localized chatbot powered by AI can make life and work noticeably easier if set up properly: you can work more efficiently and get relevant results based on your queries and the local data the LLM is pointed at. NVIDIA and AMD are accelerating the pace of AI-powered features for consumer-tier hardware, and this is just the start; expect more innovations down the road as the AI PC segment reaches new heights.
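For those who want to go beyond the chat window, LM Studio also includes a local server that speaks an OpenAI-compatible chat completions API, so the chatbot can be queried from scripts. The sketch below is a minimal example under assumed defaults: it presumes the local server is running on port 1234 with a model already loaded, and the model name and prompts are placeholders rather than anything from AMD's guide.

```python
# Minimal sketch: query a model loaded in LM Studio through its local,
# OpenAI-compatible chat completions endpoint.
# Assumptions: the LM Studio local server is running on the default
# http://localhost:1234 and a model is already loaded; "local-model"
# and the prompts below are placeholders.
import requests

response = requests.post(
    "http://localhost:1234/v1/chat/completions",
    json={
        "model": "local-model",  # placeholder; LM Studio serves whichever model is loaded
        "messages": [
            {"role": "system", "content": "You are a helpful local assistant."},
            {"role": "user", "content": "Summarize the notes in my project folder."},
        ],
        "temperature": 0.7,
    },
    timeout=120,
)
response.raise_for_status()

# Print the assistant's reply from the OpenAI-style response payload.
print(response.json()["choices"][0]["message"]["content"])
```

Because everything runs against localhost, the prompts and any referenced data never leave the machine, which is the main appeal of a localized setup like this.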