Good morning. It’s Monday, September fifteenth.
On this day in tech history: In 2011, PLOS ONE published MultiMiTar, an early machine-learning tool for predicting microRNA targets using feature engineering and multi-objective optimization. It’s a neat snapshot of how bioinformatics tackled messy data with handcrafted ML pipelines before deep learning took over.
-
ChatGPT Prepares Native Ordering System
-
Google’s VaultGemma sets privacy milestone
-
Speed over nuance, specialists over generalists: xAI’s latest AI play
-
5 Latest AI Tools
-
Latest AI Research Papers
You read. We listen. Tell us what you think by replying to this email.
How 433 Investors Unlocked 400X Return Potential
Institutional investors back startups to unlock outsized returns. Regular investors have to wait. But not anymore. Thanks to regulatory updates, some companies are doing things differently.
Take Revolut. In 2016, 433 regular people invested an average of $2,730. Today? They got a 400X buyout offer from the company, as Revolut’s valuation increased 89,900% in the same timeframe.
Founded by a former Zillow exec, Pacaso’s co-ownership tech reshapes the $1.3T vacation home market. They’ve earned $110M+ in gross profit so far, including 41% YoY growth in 2024 alone. They even reserved the Nasdaq ticker PCSO.
Paid advertisement for Pacaso’s Regulation A offering. Read the offering circular at invest.pacaso.com. Reserving a ticker symbol is not a guarantee that the company will go public. Listing on the NASDAQ is subject to approvals.

Today’s trending AI news stories
ChatGPT prepares native ordering system, Altman’s biotech startup learns to hack aging
OpenAI has a new Orders hub in development, designed to store credit cards, enable one-click checkout, and track purchases across desktop and mobile. Voice Mode is being folded directly into the main app window for tighter multimodal workflows, and parental controls on the web client hint at future deployments in classrooms and youth-focused contexts. Rollout details remain unconfirmed, but the groundwork indicates a staged release.

At the same time, OpenAI is recalibrating its partnership with Microsoft. According to The Information, Microsoft’s revenue share is set to fall from roughly 20% to 8% by 2030, freeing up more than $50 billion to fund compute-intensive training and inference. Microsoft, in exchange, is expected to gain one-third equity in a restructured OpenAI entity, though without a board seat. This underscores the asymmetry of the relationship: hyperscalers provide GPUs and cloud scale, but model companies retain governance.
On the narrative front, the company’s leadership is framing the industry’s turbulence as both necessary and inevitable. Board chair Bret Taylor likens the AI surge to the dot-com bubble, excessive in the short term but foundational for trillion-dollar industries.

Joe Betts-LaCroix is the CEO of Retro Biosciences, a longevity startup backed with money from OpenAI’s Sam Altman
Sam Altman, meanwhile, flags healthcare as one of the few jobs AI can’t touch, even as he pushes deeper into biotech. His longevity startup, Retro Biosciences, is about to run its first human trial of RTR242, a pill designed to kickstart autophagy, the cell’s recycling process, to clear misfolded proteins tied to Alzheimer’s and potentially reverse neural aging. Retro is also working on blood regeneration (RTR890) and central nervous system repair (RTR888), with a boost from OpenAI’s GPT-4b micro, which has already delivered fifty-fold gains in cellular reprogramming markers. Read more.
Google’s VaultGemma sets privacy milestone as Hassabis tempers AGI hype
DeepMind just dropped VaultGemma, a 1-billion-parameter language model trained end-to-end with differential privacy. That means it can’t memorize and leak sensitive data, because statistical noise is injected during training. Researchers also introduced “DP Scaling Laws,” adapting traditional scaling rules to private regimes without accuracy collapse.
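The noise injection follows the standard DP-SGD recipe: clip every per-example gradient so no single record can dominate an update, then add Gaussian noise scaled to that clipping bound. A minimal NumPy sketch of one such step, under generic assumptions rather than Google’s actual training code:

```python
import numpy as np

def dp_sgd_step(params, per_example_grads, lr=0.1, clip_norm=1.0, noise_multiplier=1.0):
    """One DP-SGD step: clip each example's gradient, add noise, descend."""
    batch_size = per_example_grads.shape[0]

    # 1. Clip each per-example gradient to L2 norm <= clip_norm,
    #    bounding any one record's influence on the update.
    norms = np.linalg.norm(per_example_grads, axis=1, keepdims=True)
    clipped = per_example_grads * np.minimum(1.0, clip_norm / (norms + 1e-12))

    # 2. Sum the clipped gradients and add Gaussian noise calibrated
    #    to the clipping bound (noise_multiplier sets the privacy level).
    noisy_sum = clipped.sum(axis=0) + np.random.normal(
        0.0, noise_multiplier * clip_norm, size=params.shape)

    # 3. Average over the batch and take an ordinary SGD step.
    return params - lr * noisy_sum / batch_size

# Toy usage: 8 examples, 4 parameters.
rng = np.random.default_rng(0)
params = dp_sgd_step(np.zeros(4), rng.normal(size=(8, 4)))
```

The noise-batch ratio that Google’s scaling laws revolve around is, roughly, this noise scale divided by the batch size, which is why very large batches are one way to buy back accuracy at a fixed privacy budget.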
VaultGemma is a 26-layer, decoder-only transformer with multi-query attention and a 1,024-token context length. The model matches the performance of non-private LLMs of comparable size, making it the first differentially private system that isn’t years behind the state of the art. Weights are now available on Hugging Face and Kaggle.
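The multi-query attention detail is worth a pause: all query heads share a single key/value projection, which shrinks the decode-time KV cache by roughly the head count. A toy NumPy sketch with illustrative dimensions, not VaultGemma’s actual configuration:

```python
import numpy as np

def multi_query_attention(x, wq, wk, wv, n_heads):
    """Causal multi-query attention: n_heads query heads, one shared K/V head."""
    seq_len, d_model = x.shape
    d_head = wk.shape[1]

    q = (x @ wq).reshape(seq_len, n_heads, d_head)  # per-head queries
    k = x @ wk   # shared keys   (seq_len, d_head)
    v = x @ wv   # shared values (seq_len, d_head)

    # Causal mask: position i may only attend to positions <= i.
    mask = np.triu(np.full((seq_len, seq_len), -np.inf), k=1)

    outs = []
    for h in range(n_heads):
        scores = q[:, h, :] @ k.T / np.sqrt(d_head) + mask
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        outs.append(weights @ v)  # every head reuses the same K/V
    return np.concatenate(outs, axis=-1)

# Toy run: 16 tokens, d_model=64, 4 heads of width 16.
rng = np.random.default_rng(0)
x = rng.normal(size=(16, 64))
out = multi_query_attention(x, rng.normal(size=(64, 64)),
                            rng.normal(size=(64, 16)),
                            rng.normal(size=(64, 16)), n_heads=4)
```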

The structure of our DP scaling laws. We establish that predicted loss can be accurately modeled using primarily the model size, iterations, and the noise-batch ratio, simplifying the complex interactions between the compute, privacy, and data budgets. | Image: Google
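For anyone puzzled by the “noise-batch ratio” in that caption: it is the scale of the injected DP noise relative to the batch size. In rough notation, with the exact fitted functional family left to Google’s paper:

```latex
% N = model parameters, T = training iterations,
% \sigma = DP noise stddev, B = batch size
\hat{\sigma} = \frac{\sigma}{B}, \qquad
\text{predicted loss} \approx f\left(N,\, T,\, \hat{\sigma}\right)
```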
Meanwhile, Demis Hassabis made it clear that today’s AI isn’t “PhD-level intelligence,” dismissing OpenAI’s framing of GPT-5 as misleading. Current models, he argues, still fail at basic arithmetic and lack continual learning, intuitive reasoning, and cross-domain creativity. He pegs AGI at five to ten years away, requiring breakthroughs beyond brute-force scaling. Speaking in Athens, Hassabis said the defining skill for the next generation won’t be coding but “learning how to learn,” as workers will need to reskill repeatedly.
On the consumer front, Google’s Gemini app just knocked ChatGPT off the top of the US App Store, picking up 23 million new users in just two weeks. Read more.
Speed over nuance, specialists over generalists: xAI’s latest AI play
xAI has launched Grok 4 Fast, an early-access beta model that trades depth for raw speed, delivering answers up to 10 times faster than Grok 4 by reducing compute spent on deep reasoning. The result is a leaner model suited to straightforward factual lookups, simple code generation, and quick tool integrations, which means it’s less useful if you’re after nuanced analysis or creative writing.
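If Grok 4 Fast ships through xAI’s existing OpenAI-compatible API, pointing those quick-lookup workloads at it would plausibly be a one-line change. A hedged sketch; the grok-4-fast identifier is a guess, since no API model name has been published for the beta:

```python
from openai import OpenAI  # xAI exposes an OpenAI-compatible API

client = OpenAI(
    base_url="https://api.x.ai/v1",  # xAI's public endpoint
    api_key="YOUR_XAI_API_KEY",      # substitute a real key
)

# "grok-4-fast" is a hypothetical model name for the early-access beta.
response = client.chat.completions.create(
    model="grok-4-fast",
    messages=[{"role": "user", "content": "In what year did Apollo 11 land on the Moon?"}],
)
print(response.choices[0].message.content)
```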
Grok 4 Fast is essentially a “lean mode” LLM, first tested under the codename Sonoma, and it could soon replace Grok 3 for free-tier users. A new changelog page has also been introduced to provide clearer update transparency, hinting at a push toward more iterative, user-facing development. But speed isn’t the only lever xAI is pulling.

The company also laid off about 500 staff, roughly a third of its data annotation team. These were the generalist tutors who sorted and explained raw data to train Grok. They’ve been swapped out for a planned wave of domain specialists in medicine, finance, and science, which xAI says it will expand “tenfold.” Read more.


Latest AI Research Papers
arXiv is a free online library where researchers share pre-publication papers.



Your feedback is valuable. Reply to this email and tell us how you think we could add more value to this newsletter.
Interested in reaching smart readers like you? To become an AI Breakfast sponsor, reply to this email or DM us on 𝕏!