Groq® Is Still Faster
MOUNTAIN VIEW, Calif., March 18, 2024 /PRNewswire/ — Groq®, a generative AI solutions company, responds to the NVIDIA GTC keynote: “Still faster.”
About Groq
Groq® is a generative AI solutions company and the creator of the LPU™ Inference Engine, the fastest language processing accelerator on the market. It is architected from the ground up to deliver low-latency, energy-efficient, and repeatable inference performance at scale. Customers rely on the LPU Inference Engine as an end-to-end solution for running Large Language Models (LLMs) and other generative AI applications at 10x the speed. Groq Systems powered by the LPU Inference Engine are available for purchase. Customers can also use the LPU Inference Engine for experimentation and production-ready applications via an API in GroqCloud™ by purchasing Tokens-as-a-Service. Jonathan Ross, inventor of the Google Tensor Processing Unit (TPU), founded Groq to preserve human agency while building the AI economy. Experience Groq speed for yourself at groq.com.
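For readers curious how API access via GroqCloud typically looks in practice, the following is a minimal sketch using the groq Python client and an OpenAI-style chat-completions request. The model name, prompt, and the GROQ_API_KEY environment variable are illustrative assumptions and are not specified in this release.

```python
import os

from groq import Groq  # assumes the groq Python client is installed (pip install groq)

# Illustrative: the API key is read from an environment variable named GROQ_API_KEY.
client = Groq(api_key=os.environ.get("GROQ_API_KEY"))

# Send a simple chat-completion request to GroqCloud.
# The model identifier below is an example; available models may differ.
completion = client.chat.completions.create(
    model="mixtral-8x7b-32768",
    messages=[
        {"role": "user", "content": "Summarize what an LPU Inference Engine is in one sentence."}
    ],
)

# Print the generated response text.
print(completion.choices[0].message.content)
```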
Media Contact for Groq
Allyson Scott
[email protected]
View original content to download multimedia: https://www.prnewswire.com/news-releases/groq-is-still-faster-302092177.html
SOURCE Groq