


    Nvidia Chat with RTX lets you run LLMs locally on RTX 30 and 40 series GPUs

    Tech
    Last updated: February 15th, 2024 at 20:45 UTC+01:00

    Nvidia has released the Chat with RTX application for Windows. It can run two large language models (LLMs), Llama 2 and Mistral, right on RTX 30 series and RTX 40 series graphics cards and answer your queries about content, such as documents, photos, and videos, that you connect to the app. It can also load a YouTube video’s transcript and answer questions about that video.

    Basically, Chat with RTX allows you to run LLMs locally on your PC without needing an internet connection, which, according to Nvidia, should get you “fast and secure results.” The company calls it a “demo” app, which we assume means it is a beta version. Fortunately, the app is free: the download weighs in at 35GB, and you can grab it right from Nvidia’s official website.
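Chat with RTX itself is a packaged Windows app, but the underlying idea it demonstrates — retrieving the most relevant chunk of your local files and handing it to a locally running LLM as context — can be sketched in a few lines. Everything below is a hypothetical illustration, not Nvidia’s actual code: the word-overlap scoring stands in for real embedding-based retrieval, and the model call is stubbed out where a local Llama 2 or Mistral instance would be prompted.

```python
# Hypothetical sketch of the retrieval-augmented pattern behind apps like
# Chat with RTX: index local documents, pick the most relevant chunk for a
# query, and pass it to a local LLM as context. The LLM call is a stub here;
# a real pipeline would prompt a locally running Llama 2 or Mistral model.

import re
from collections import Counter

def tokenize(text: str) -> list[str]:
    # Lowercase and strip punctuation so "GPUs." matches "gpus".
    return re.findall(r"[a-z0-9]+", text.lower())

def score(query: str, chunk: str) -> int:
    # Simple word-overlap score; real systems use vector embeddings.
    q, c = Counter(tokenize(query)), Counter(tokenize(chunk))
    return sum(min(q[w], c[w]) for w in q)

def retrieve(query: str, chunks: list[str]) -> str:
    # Return the chunk that best matches the query.
    return max(chunks, key=lambda ch: score(query, ch))

def answer(query: str, chunks: list[str]) -> str:
    context = retrieve(query, chunks)
    # Stub: a real pipeline would feed query + context to the local model.
    return f"[local LLM would answer {query!r} using context: {context!r}]"

docs = [
    "Chat with RTX runs Llama 2 and Mistral locally on RTX 30 and 40 GPUs.",
    "The app can load YouTube transcripts and answer questions about them.",
]
print(answer("Which GPUs does Chat with RTX support?", docs))
```

Running this prints the stubbed answer built from the first document, since it shares the most words with the query — the same document-grounded behavior the app promises, minus the GPU.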

    Chat with RTX is the first app of its kind; it makes LLMs more accessible and opens up a whole new world of possibilities. Nvidia could roll out many more features and make the app useful in ways we can’t yet imagine. It has been only a year and a half since OpenAI launched ChatGPT, and we are already seeing so many innovations in the world of GenAI. The future of GenAI is going to be very interesting.

