Chatbots are all the craze right now, with some services fielding millions of queries a day barely a year after launch. Now, thanks to NVIDIA, you can run a chatbot on your own PC. Of course, this assumes you have a PC with an RTX 30 Series or newer GPU with at least 8 GB of VRAM, so that does rule out some entry-level laptops.
Chat with RTX uses retrieval-augmented generation (RAG), NVIDIA TensorRT-LLM software, and RTX acceleration to bring generative AI to a local Windows PC. You can quickly connect your local files as a dataset to an open-source large language model, such as Mistral or Llama 2, to enable contextual queries.
Rather than searching through notes or saved content, users can simply type queries. For example, one could ask, “What was the restaurant my partner recommended while in Las Vegas?” and Chat with RTX will scan local files the user points it to and provide the answer with context.
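To make the RAG flow concrete, here is a minimal sketch of the idea: retrieve the local document most relevant to a query, then fold it into the prompt sent to the language model. This is illustrative only; the function names are made up, and the naive keyword-overlap scoring stands in for the vector-embedding search a real tool like Chat with RTX would use.

```python
def retrieve(query: str, documents: dict[str, str]) -> str:
    """Return the name of the document sharing the most words with the query.

    Toy stand-in for embedding-based retrieval: score each document by
    how many query words appear in it and pick the best match.
    """
    query_words = set(query.lower().split())

    def overlap(item: tuple[str, str]) -> int:
        _name, text = item
        return len(query_words & set(text.lower().split()))

    return max(documents.items(), key=overlap)[0]


def build_prompt(query: str, context: str) -> str:
    """Prepend the retrieved context to the user's question for the LLM."""
    return f"Answer using this context:\n{context}\n\nQuestion: {query}"


# Hypothetical local notes standing in for a user's files.
notes = {
    "vegas_trip.txt": "Partner recommended the steakhouse near the Bellagio in Las Vegas.",
    "groceries.txt": "Buy milk, eggs, and bread on Saturday.",
}

query = "What was the restaurant my partner recommended while in Las Vegas?"
best = retrieve(query, notes)
prompt = build_prompt(query, notes[best])
```

The retrieval step grounds the model's answer in the user's own files, which is what lets the chatbot answer questions about content it was never trained on.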
The tool supports various file formats, including .txt, .pdf, .doc/.docx and .xml. Point the application at the folder containing these files, and the tool will load them into its library in just seconds.
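Building that library amounts to walking the chosen folder and keeping only the supported file types. A rough sketch of that scan, using nothing but the Python standard library (the function name and logic are mine, not NVIDIA's):

```python
from pathlib import Path

# File extensions the article lists as supported.
SUPPORTED = {".txt", ".pdf", ".doc", ".docx", ".xml"}


def collect_files(folder: str) -> list[Path]:
    """Recursively gather files with supported extensions from a folder."""
    return sorted(
        p for p in Path(folder).rglob("*")
        if p.is_file() and p.suffix.lower() in SUPPORTED
    )
```

Each collected file would then be parsed and indexed so the retrieval step above can search it.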
You can also include information from YouTube videos and playlists. Add a video URL, and its transcript becomes part of the searchable context.
Chat with RTX is available now as a free tech demo.