Nvidia expands the local chatbot ChatRTX: more LLMs, photos and voice
May 2, 2024
Nvidia is rolling out a major upgrade to its locally running alternative to cloud chatbots. ChatRTX will support more LLMs and handle photos more easily.
Nvidia is expanding the capabilities of ChatRTX. The application now supports more LLMs and more languages, and it handles your photos better. ChatRTX is an interesting alternative to classic generative AI chatbots because it runs locally. Anyone with a powerful enough PC (read: one equipped with a robust Nvidia GPU) can query their own data via an LLM without any data or prompts going to the cloud.
Voice and photos
ChatRTX now supports additional LLMs, including Google's Gemma and ChatGLM3; Mistral and Llama 2 were already available. The tool works on your own local data, so you can ask questions about it. Thanks to newly added support for OpenAI's Whisper, those questions no longer have to be typed: the speech recognition model converts spoken questions into text queries. ChatRTX now supports multiple languages as well.
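To give an idea of how such a voice step can work locally, here is a minimal sketch using the open-source whisper package. The model size and audio file name are placeholder assumptions for illustration; this is not Nvidia's actual ChatRTX pipeline.

```python
# Minimal sketch: turning a spoken question into a text query with the
# open-source "whisper" package (pip install openai-whisper).
# The audio file name and model size are placeholders, not ChatRTX internals.
import whisper

model = whisper.load_model("base")          # small multilingual model, runs locally
result = model.transcribe("question.wav")   # speech recognition, no cloud calls
query = result["text"].strip()

print(f"Query for the LLM: {query}")
```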
ChatRTX can also handle your photos better. The integration of OpenAI's CLIP (Contrastive Language-Image Pre-Training) lets you search your images with plain text queries, without any metadata labeling.
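The sketch below shows the general idea behind CLIP-based image search, here using the Hugging Face transformers library. The model checkpoint, image paths, and query are assumptions for illustration, not Nvidia's integration.

```python
# Minimal sketch: text-based photo search with OpenAI's CLIP via the
# Hugging Face transformers library. The image paths and query are
# placeholders; this illustrates the concept, not Nvidia's implementation.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

paths = ["holiday.jpg", "dog.jpg", "receipt.jpg"]   # hypothetical local photos
images = [Image.open(p) for p in paths]

inputs = processor(text=["a photo of a dog on the beach"],
                   images=images, return_tensors="pt", padding=True)
with torch.no_grad():
    out = model(**inputs)

# Higher logits_per_text means the image matches the text query better.
scores = out.logits_per_text.softmax(dim=-1).squeeze(0)
best = paths[int(scores.argmax())]
print(f"Best match: {best}")
```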
ChatRTX was previously called Chat with RTX. Nvidia continues to describe the tool as a demo, one that shows what the future of the AI PC could look like: very powerful features running on the local computer, with no cloud involvement required.