RouteLLM, Google DeepMind's JEST Method, Apple Joins OpenAI BOD
New AI Training Method JEST, RouteLLM Achieves 90% GPT-4 Quality at 80% Lower Cost, Meta AI Develops a Compact Language Model for Mobile Devices, & More…
🌐 AI News
🔜 Up Next:
- July 15-17, 2024 Fortune Brainstorm Tech 2024 (Park City, Utah). Register Now
- July 30-31, 2024 Fortune Brainstorm AI Singapore. Register Now
🚀 Top AI Highlights
RouteLLM Achieves 90% GPT-4 Quality at 80% Lower Cost

RouteLLM, an open-source framework from lmsys.org, cuts the cost of running large language models (LLMs) by up to 80% while maintaining roughly 95% of GPT-4's quality. It balances cost and performance by managing smaller, open-source models through an orchestration layer, pushing most computation to local devices such as phones and laptops.
Key highlights include:
Cost Reduction: Achieves significant savings, with an 85% cost reduction on MT Bench, 45% on MMLU, and 35% on GSM8K, all while maintaining high performance.
Optimized Systems: Uses smaller models and agentic systems to balance quality, efficiency, cost, privacy, and security.
Efficient Query Handling: Local models handle 90-95% of queries, routing only complex ones to expensive models like GPT-4.
Future-Proof: As LLMs improve, more tasks can be managed locally, reducing overall costs.
RouteLLM also undercuts commercial routing systems such as Martian and Unify AI, costing over 40% less while delivering similar performance. The framework generalizes across different model pairs without retraining, making it a valuable tool for efficient LLM deployment. Benefits include lower costs, enhanced AI techniques, and improved efficiency through local edge devices.
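To make the routing idea concrete, here is a minimal Python sketch of how a router can keep cheap queries on a small local model and escalate hard ones to GPT-4. This illustrates the general technique, not RouteLLM's actual API; the names (`Route`, `route_query`), the length-based heuristic, and the 0.3 threshold are assumptions made for the example.

```python
# Illustrative LLM routing sketch (not RouteLLM's actual interface).
# A router scores each query; low-scoring queries stay on a cheap local
# model, high-scoring ones are escalated to a stronger, pricier model.

from dataclasses import dataclass
from typing import Callable


@dataclass
class Route:
    name: str
    cost_per_1k_tokens: float          # rough relative cost, for illustration
    generate: Callable[[str], str]     # placeholder for the real model call


def route_query(query: str,
                score_fn: Callable[[str], float],   # 0..1 "needs strong model" score
                weak: Route,
                strong: Route,
                threshold: float = 0.3) -> str:
    """Send the query to the strong model only when the router's score
    exceeds the cost/quality threshold."""
    chosen = strong if score_fn(query) > threshold else weak
    return chosen.generate(query)


# Toy wiring with placeholder model calls and a naive length-based heuristic:
weak = Route("local-7b", 0.0002, lambda q: f"[local-7b] {q[:40]}...")
strong = Route("gpt-4", 0.03, lambda q: f"[gpt-4] {q[:40]}...")
naive_score = lambda q: min(len(q) / 500, 1.0)

print(route_query("What is 2 + 2?", naive_score, weak, strong))          # stays local
print(route_query("Draft a phased cloud migration plan " * 30,
                  naive_score, weak, strong))                            # escalates
```

A production router like the ones described for RouteLLM is a learned model rather than a length heuristic; the heuristic here only stands in to keep the example self-contained.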
Google DeepMind’s JEST Method: 13x Faster & 10x More Power-Efficient

Google DeepMind's JEST (joint example selection) method significantly enhances AI training efficiency, cutting the number of training iterations by up to 13x and the computational power required by up to 10x. The resulting drop in energy consumption addresses environmental concerns and cuts costs.
Key Features
Batch Data Selection: Unlike traditional methods that focus on individual data points, JEST selects entire batches of data, optimizing the learning process.
Curated Initial Data: It starts with a highly curated dataset to guide the training on larger, messier datasets, ensuring efficiency without compromising quality.
Multimodal Contrastive Learning: JEST scores different types of data (such as images and their text captions) jointly, accelerating the learning process.
Industry Implications
Power Demands: AI workloads consume rapidly growing amounts of power; by cutting the compute needed per training run, JEST helps mitigate these demands.
Global Competition: With significant advancements from Chinese AI companies, JEST provides a competitive edge by improving efficiency and reducing costs.
JEST represents a major leap forward in AI training, offering substantial efficiency gains and reduced resource consumption while addressing both financial and environmental concerns, making it a notable innovation in the evolving AI landscape.
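For a concrete feel of the batch-selection idea, here is a simplified Python sketch of "learnability" scoring, assuming per-example losses from the learner and from a smaller, pretrained reference model are already available. It is a per-example approximation written for illustration: the function names are invented for this sketch, and the published JEST method scores candidate sub-batches jointly rather than one example at a time.

```python
# Simplified "learnability" scoring sketch in the spirit of JEST-style
# batch selection. Names and thresholds are illustrative, not DeepMind's code.

import numpy as np


def select_training_subset(learner_losses: np.ndarray,
                           reference_losses: np.ndarray,
                           keep_fraction: float = 0.1) -> np.ndarray:
    """Pick the most 'learnable' examples from a large candidate batch.

    An example scores high when the current learner still finds it hard
    (high learner loss) but the curated reference model finds it easy
    (low reference loss) -- informative data rather than noise.
    """
    learnability = learner_losses - reference_losses
    k = max(1, int(keep_fraction * learnability.size))
    return np.argsort(learnability)[-k:]        # indices of the top-k examples


# Toy usage with random losses standing in for real model outputs:
rng = np.random.default_rng(0)
learner_losses = rng.uniform(0.0, 5.0, size=4096)
reference_losses = rng.uniform(0.0, 5.0, size=4096)
selected = select_training_subset(learner_losses, reference_losses)
print(f"Kept {selected.size} of 4096 candidate examples for this step")
```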
😍 Enjoying it so far? Share it with your friends!
Apple Joins OpenAI Board of Directors as an Observer

The partnership between Apple and OpenAI gives Apple an observer seat on OpenAI's board, with Phil Schiller, Apple's former marketing chief, taking the role. The arrangement involves no payment from Apple to OpenAI. The observer seat lets Apple attend board meetings and receive information about OpenAI's operations, but grants no voting rights or decision-making power.
As part of the partnership, Apple continues to develop its own AI for its devices and private cloud, using OpenAI's API only for specific functions that need advanced natural language capabilities, such as the ChatGPT integration on its devices. User data is not shared with OpenAI without explicit user confirmation. This hybrid approach lets Apple benefit from OpenAI's technology while keeping control over its own AI development and data privacy.
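The hybrid flow described above can be sketched in a few lines of Python: handle the request on-device by default, and call the external ChatGPT API only after the user explicitly agrees. Every name here (`handle_request`, `ask_user_consent`, `call_chatgpt`) is a hypothetical stand-in for illustration, not Apple's or OpenAI's actual interfaces.

```python
# Hypothetical sketch of an on-device-first flow with an explicit consent
# gate before any data is sent to an external ChatGPT API.

from typing import Callable


def handle_request(prompt: str,
                   on_device_model: Callable[[str], str],    # small local model
                   needs_cloud: bool,                        # task exceeds local capability
                   ask_user_consent: Callable[[str], bool],  # UI prompt returning True/False
                   call_chatgpt: Callable[[str], str]) -> str:
    if not needs_cloud:
        # Default path: data and inference stay on the device.
        return on_device_model(prompt)
    if ask_user_consent("Send this request to ChatGPT?"):
        # Data leaves the device only after the user explicitly agrees.
        return call_chatgpt(prompt)
    # User declined: fall back to the on-device model.
    return on_device_model(prompt)


# Toy wiring with placeholder callables:
print(handle_request(
    "Summarise my notes",
    on_device_model=lambda p: f"[on-device] summary of: {p}",
    needs_cloud=True,
    ask_user_consent=lambda msg: True,
    call_chatgpt=lambda p: f"[ChatGPT] answer to: {p}",
))
```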
Now the Question Is…
Given the partnership between Apple and OpenAI, where Apple utilizes OpenAI's API for advanced natural language processing tasks, what is the strategic purpose of Apple's own small language model, OpenELM, designed specifically for on-device operations? Can ChatGPT, as a large language model, truly be integrated into Apple's devices, and how does this impact the role and necessity of OpenELM?
🚀 Tech Glimpse of the Week
- Chip giant TSMC crosses $1 trillion market cap, riding on the back of Nvidia's gains
- Virga: One of world’s most powerful AI supercomputers launched in Australia
- ChatGPT might rule the AI chatbots — but it can't beat Google Search
- Meta AI develops compact language model for mobile devices
- Tencent boosts 100,000 GPU-capable HPC clusters with network optimization
👥 Connect & Feedback!
👉 Join Us:
📧 Advertise In Weekly AI News:
📧 Contact directly at [email protected]
😍 Share with your friends!
