
Nvidia's new AI chatbot runs locally on your PC, and it's free

Feb 14, 2024 | Hi-network.com
Nvidia's GeForce RTX 40 series GPU. (Image: Nvidia)

Nvidia has released a demo version of a new AI chatbot that runs locally on certain PCs with GeForce RTX. The demo app, called Chat with RTX, is free to download and enables users to run an AI chatbot on their PCs, personalized with their content. 

Also: The best AI chatbots: ChatGPT and other noteworthy alternatives

Powered by Nvidia TensorRT-LLM software, the app can generate responses and can be trained on a user's selection of content, including .txt, .pdf, .doc/.docx, and .xml files, and even URLs to YouTube videos.
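To make the file handling concrete, here is a minimal sketch, in Python, of how an app like this might gather those supported file types from a local folder and split them into chunks for indexing. The folder path, chunk size, and helper names are illustrative assumptions, not Chat with RTX's actual code.

```python
# Minimal sketch: collect supported local files and split text into chunks for
# later indexing. Paths, extensions, and chunk size are illustrative assumptions,
# not Chat with RTX's real pipeline.
from pathlib import Path

SUPPORTED = {".txt", ".pdf", ".doc", ".docx", ".xml"}

def collect_documents(folder: str) -> list[Path]:
    """Return all files under `folder` with an extension this kind of app accepts."""
    return [p for p in Path(folder).rglob("*") if p.suffix.lower() in SUPPORTED]

def chunk_text(text: str, size: int = 1000, overlap: int = 200) -> list[str]:
    """Split raw text into overlapping chunks so retrieval can return focused passages."""
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap
    return chunks

if __name__ == "__main__":
    for doc in collect_documents("./my_notes"):  # hypothetical folder name
        print(doc)
```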

After choosing content to train the bot, users can ask it personalized questions about the material they've supplied. For example, the bot can summarize the step-by-step instructions in a how-to YouTube video or tell a user which type of batteries was on their shopping list.

Training the bot on the user's preferred content makes the experience truly personalized, and because everything happens locally, user data stays private. Chat with RTX can return fast responses while keeping all user information on the device because it doesn't rely on cloud-based services, which also means it can run without an internet connection.

Also: ChatGPT vs. Copilot: Which AI chatbot is better for you?

The bot requires an Nvidia GeForce RTX 30 Series GPU or higher, with at least 8 GB of VRAM. Chat with RTX also requires Windows 10 or 11 and the latest Nvidia GPU drivers.
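For readers who want to verify their hardware before downloading, the sketch below queries the GPU with nvidia-smi (which ships with Nvidia's drivers) and compares the reported VRAM against the 8 GB minimum. It's a convenience check under those assumptions, not an official installer step.

```python
# Quick check of GPU name and VRAM against the 8 GB requirement.
# Relies on nvidia-smi from Nvidia's drivers; not part of Chat with RTX itself.
import subprocess

def meets_vram_requirement(min_mib: int = 8192) -> bool:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,memory.total", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    for line in out.strip().splitlines():
        name, mem = [field.strip() for field in line.split(",")]
        print(f"{name}: {mem} MiB")
        if int(mem) >= min_mib:
            return True
    return False

if __name__ == "__main__":
    print("Meets 8 GB VRAM requirement:", meets_vram_requirement())
```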

Nvidia says its TensorRT-LLM software, combined with retrieval-augmented generation (RAG) and RTX acceleration, allows Chat with RTX to provide relevant answers by using local files as a dataset and connecting them to open-source LLMs like Mistral and Llama 2. 
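In practice, RAG means the app retrieves the passages from the user's files most relevant to a question and feeds them to the model alongside that question. The sketch below illustrates that flow with a toy word-overlap retriever and a stubbed-out generation call; both stand in for Nvidia's actual embedding and TensorRT-LLM components, which are not shown here.

```python
# Minimal RAG sketch: score local chunks against the question, then build a prompt
# that grounds the local model's answer in those chunks. score() and
# generate_locally() are illustrative stand-ins, not Nvidia's implementation.
def score(question: str, chunk: str) -> int:
    """Toy relevance score: number of words shared between question and chunk."""
    return len(set(question.lower().split()) & set(chunk.lower().split()))

def retrieve(question: str, chunks: list[str], k: int = 3) -> list[str]:
    """Return the k chunks most relevant to the question."""
    return sorted(chunks, key=lambda c: score(question, c), reverse=True)[:k]

def build_prompt(question: str, context: list[str]) -> str:
    joined = "\n\n".join(context)
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{joined}\n\nQuestion: {question}\nAnswer:"
    )

def generate_locally(prompt: str) -> str:
    """Placeholder for a call into a local model such as Mistral or Llama 2."""
    return "(local model output would appear here)"

if __name__ == "__main__":
    chunks = ["Buy AA batteries and a USB-C cable.", "Meeting notes from Monday."]
    question = "Which type of batteries were on my shopping list?"
    print(generate_locally(build_prompt(question, retrieve(question, chunks))))
```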

Tags: Innovation
