    Snowflake Teams Up with Meta to Host and Optimize New Flagship Model Family in Snowflake Cortex AI

    July 2024 – Snowflake (NYSE: SNOW), the AI Data Cloud company, today announced that it will host the Llama 3.1 collection of multilingual open source large language models (LLMs) in Snowflake Cortex AI for enterprises to easily harness and build powerful AI applications at scale. This offering includes Meta’s largest and most powerful open source LLM, Llama 3.1 405B, with Snowflake developing and open sourcing the inference system stack to enable real-time, high-throughput inference and further democratize powerful natural language processing and generation applications. Snowflake’s industry-leading AI Research Team has optimized Llama 3.1 405B for both inference and fine-tuning, supporting a massive 128K context window from day one, while enabling real-time inference with up to 3x lower end-to-end latency and 1.4x higher throughput than existing open source solutions. Moreover, it allows for fine-tuning on the massive model using just a single GPU node — eliminating costs and complexity for developers and users — all within Cortex AI.
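
    To make the offering concrete, here is a minimal sketch of how an enterprise might call a Llama 3.1 model hosted in Cortex AI through the SNOWFLAKE.CORTEX.COMPLETE SQL function, using Snowflake's Python connector. The connection parameters and the exact model identifier ("llama3.1-405b") are illustrative assumptions, not details confirmed in this announcement.

        # Minimal sketch (assumptions noted above): invoking a Llama 3.1 model
        # hosted in Snowflake Cortex AI via the SNOWFLAKE.CORTEX.COMPLETE SQL
        # function, called through the snowflake-connector-python driver.
        import snowflake.connector

        conn = snowflake.connector.connect(
            account="my_account",        # placeholder account locator
            user="my_user",              # placeholder credentials
            password="my_password",
            warehouse="my_warehouse",
        )

        cur = conn.cursor()
        cur.execute(
            "SELECT SNOWFLAKE.CORTEX.COMPLETE(%s, %s)",
            ("llama3.1-405b",            # assumed model identifier
             "Summarize the key risks in this supplier contract: ..."),
        )
        print(cur.fetchone()[0])         # the model's text completion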

    By partnering with Meta, Snowflake is providing customers with easy, efficient, and trusted ways to seamlessly access, fine-tune, and deploy Meta’s newest models in the AI Data Cloud, with a comprehensive approach to trust and safety built-in at the foundational level.

    “Snowflake’s world-class AI Research Team is blazing a trail for how enterprises and the open source community can harness state-of-the-art open models like Llama 3.1 405B for inference and fine-tuning in a way that maximizes efficiency,” said Vivek Raghunathan, VP of AI Engineering, Snowflake. “We’re not just bringing Meta’s cutting-edge models directly to our customers through Snowflake Cortex AI. We’re arming enterprises and the AI community with new research and open source code that supports 128K context windows, multi-node inference, pipeline parallelism, 8-bit floating point quantization, and more to advance AI for the broader ecosystem.”

    Snowflake’s Industry-Leading AI Research Team Unlocks the Fastest, Most Memory Efficient Open Source Inference and Fine-Tuning
    Snowflake’s AI Research Team continues to push the boundaries of open source innovations through its regular contributions to the AI community and transparency around how it is building cutting-edge LLM technologies. In tandem with the launch of Llama 3.1 405B, Snowflake’s AI Research Team is now open sourcing its Massive LLM Inference and Fine-Tuning System Optimization Stack in collaboration with DeepSpeed, Hugging Face, vLLM, and the broader AI community. This breakthrough establishes a new state-of-the-art for open source inference and fine-tuning systems for multi-hundred billion parameter models.

    Massive model scale and memory requirements pose significant challenges for users aiming to achieve low-latency inference for real-time use cases, high throughput for cost effectiveness, and long context support for various enterprise-grade generative AI use cases. The memory requirements of storing model and activation states also make fine-tuning extremely challenging, with the large GPU clusters required to fit the model states for training often inaccessible to data scientists.
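
    As a rough, purely illustrative calculation (using assumed hardware figures, not numbers from this announcement), the arithmetic below shows why a 405-billion-parameter model strains even a well-equipped GPU node and why 8-bit quantization and multi-node parallelism matter:

        # Back-of-envelope memory arithmetic for a 405B-parameter model.
        # The 80 GB-per-GPU and 8-GPUs-per-node figures are assumptions
        # (e.g. a typical H100 node), not Snowflake's published configuration.
        params = 405e9                       # Llama 3.1 405B parameter count

        weights_fp16_gb = params * 2 / 1e9   # 16-bit weights: ~810 GB
        weights_fp8_gb = params * 1 / 1e9    # 8-bit (FP8) weights: ~405 GB
        node_capacity_gb = 8 * 80            # assumed 8 x 80 GB GPUs = 640 GB

        print(f"FP16 weights: ~{weights_fp16_gb:.0f} GB vs. {node_capacity_gb} GB per node")
        print(f"FP8 weights:  ~{weights_fp8_gb:.0f} GB vs. {node_capacity_gb} GB per node")

    Activation and KV-cache memory for a 128K-token context come on top of the weights, which is why serving long contexts and fine-tuning at this scale normally push workloads onto large clusters.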

    Snowflake’s Massive LLM Inference and Fine-Tuning System Optimization Stack addresses these challenges. By using advanced parallelism techniques and memory optimizations, Snowflake enables fast and efficient AI processing without needing complex and expensive infrastructure. For Llama 3.1 405B, Snowflake’s system stack delivers real-time, high-throughput performance on just a single GPU node and supports a massive 128K context window across multi-node setups. This flexibility extends to both next-generation and legacy hardware, making it accessible to a broader range of businesses. Moreover, data scientists can fine-tune Llama 3.1 405B using mixed precision techniques on fewer GPUs, eliminating the need for large GPU clusters. As a result, organizations can adapt and deploy powerful enterprise-grade generative AI applications easily, efficiently, and safely.
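
    For a sense of what "eliminating the need for large GPU clusters" is measured against, the rough figures below are our own illustration: the ~16 bytes-per-parameter estimate is the commonly cited cost of standard mixed-precision Adam training, not a number from this announcement.

        # Illustrative arithmetic: naive mixed-precision Adam fine-tuning keeps
        # FP16 weights and gradients plus FP32 master weights and optimizer
        # moments, commonly estimated at ~16 bytes per parameter (assumption).
        params = 405e9
        bytes_per_param = 16                        # assumed rule of thumb
        training_state_tb = params * bytes_per_param / 1e12
        gpus_needed = training_state_tb * 1e3 / 80  # assuming 80 GB per GPU
        print(f"~{training_state_tb:.1f} TB of model/optimizer state "
              f"(~{gpus_needed:.0f} x 80 GB GPUs before counting activations)")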

    Snowflake’s AI Research Team has also developed optimized fine-tuning infrastructure that includes model distillation, safety guardrails, retrieval augmented generation (RAG), and synthetic data generation, so that enterprises can easily get started with these use cases within Cortex AI.

    Snowflake Cortex AI Furthers Commitment to Delivering Trustworthy, Responsible AI
    AI safety is of the utmost importance to Snowflake and its customers. As a result, Snowflake is making Snowflake Cortex Guard generally available to further safeguard against harmful content for any LLM application or asset built in Cortex AI — either using Meta’s latest models, or the LLMs available from other leading providers including AI21 Labs, Google, Mistral AI, Reka, and Snowflake itself. Cortex Guard leverages Meta’s Llama Guard 2, further unlocking trusted AI for enterprises so they can ensure that the models they’re using are safe.
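
    As a hedged illustration, enabling Cortex Guard amounts to an option on the same COMPLETE call. The sketch below assumes the message-array form of the function and a 'guardrails' flag as described in Snowflake's Cortex documentation; the model identifier and connection details are placeholders, not details from this announcement.

        # Illustrative sketch: a COMPLETE call with Cortex Guard enabled.
        # The 'guardrails' option and message-array prompt form are assumptions
        # based on Snowflake's Cortex documentation; model id and credentials
        # are placeholders.
        import snowflake.connector

        conn = snowflake.connector.connect(
            account="my_account", user="my_user",
            password="my_password", warehouse="my_warehouse",
        )
        cur = conn.cursor()
        cur.execute(
            """
            SELECT SNOWFLAKE.CORTEX.COMPLETE(
                'llama3.1-70b',                                   -- assumed model id
                [{'role': 'user', 'content': 'Draft a refund policy summary.'}],
                {'guardrails': TRUE}                              -- Cortex Guard filtering
            )
            """
        )
        print(cur.fetchone()[0])  # JSON string containing the (filtered) completion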

    Comments on the News from Snowflake Customers and Partners
    “As a leader in the hospitality industry, we rely on generative AI to deeply understand and quantify key topics within our Voice of the Customer platform. Gaining access to Meta’s industry-leading Llama models within Snowflake Cortex AI empowers us to further talk to our data, and glean the necessary insights we need to move the needle for our business,” said Dave Lindley, Sr. Director of Data Products, E15 Group. “We’re looking forward to fine-tuning and testing Llama to drive real-time action in our operations based on live guest feedback.”

    “Safety and trust are a business imperative when it comes to harnessing generative AI, and Snowflake provides us with the assurances we need to innovate and leverage industry-leading large language models at scale,” said Ryan Klapper, an AI leader at Hakkoda. “The powerful combination of Meta’s Llama models within Snowflake Cortex AI unlocks even more opportunities for us to service internal RAG-based applications. These applications empower our stakeholders to interact seamlessly with comprehensive internal knowledge bases, ensuring they have access to accurate and relevant information whenever needed.”

    “By harnessing Meta’s Llama models within Snowflake Cortex AI, we’re giving our customers access to the latest open source LLMs,” said Matthew Scullion, Matillion CEO and co-founder. “The upcoming addition of Llama 3.1 gives our team and users even more choice and flexibility to access the large language models that suit use cases best, and stay on the cutting-edge of AI innovation. Llama 3.1 within Snowflake Cortex AI will be immediately available with Matillion on Snowflake’s launch day.”

    “As a leader in the customer engagement and customer data platform space, Twilio’s customers need access to the right data to create the right message for the right audience at the right time,” said Kevin Niparko, VP of Product and Technology Strategy at Twilio Segment. “The ability to choose the right model for their use case within Snowflake Cortex AI empowers our joint customers to generate AI-driven, intelligent insights and easily activate them in downstream tools. In an era of rapid evolution, businesses need to iterate quickly on unified data sets to drive the best outcomes.”
