Meta Partners with Midjourney


Good morning. It’s Monday, August 25th.

On this day in tech history: In 1991, on comp.os.minix, Linus Torvalds posted about his new Unix-like operating system, initially a monolithic kernel written in C and x86 assembly with multitasking, virtual memory, and terminal I/O support. The post sparked the open-source software revolution and launched the now-ubiquitous Linux kernel that powers everything from servers to smartphones.

  • Meta Partners With Midjourney

  • Grok 2 Goes Open Source

  • Siri to be rebuilt with Gemini?

  • 5 Recent AI Tools

  • Latest AI Research Papers

You read. We listen. Tell us what you think by replying to this email.

Turn AI Into Your Income Stream

The AI economy is booming, and smart entrepreneurs are already profiting. Subscribe to Minimum Ream and get quick access to 200+ proven strategies to monetize AI tools like ChatGPT, Midjourney, and more. From content creation to automation services, discover actionable ways to build your AI-powered income. No coding required, just practical strategies that work.

Today’s trending AI news stories

Meta Partners with Midjourney

Meta has just licensed Midjourney’s generative image and video models, bringing the startup’s distinctive “aesthetic technology” into Meta’s future AI stack. Announced by Chief AI Officer Alexandr Wang, the deal complements Meta’s in-house work on Imagine, Movie Gen, and DinoV3 while positioning it to better compete with OpenAI’s Sora, Google’s Veo, and Black Forest Labs’ Flux.

Midjourney, with 20 million users and a reputation for design-forward outputs, remains independent and community-backed, but its touch may soon shape Instagram tools, VR assets, and even Meta’s widely mocked chatbots. Founder David Holz emphasized continuity for its subscription services, though industry watchers see the deal as Meta’s bid to fuse compute scale with generative creativity.

That compute scale is taking shape in Richland Parish, Louisiana, where Meta is investing $10 billion to build “Hyperion,” a 4 million-square-foot data center designed to deliver up to 5 GW of AI compute capacity, surpassing any existing site. The nine-building campus will feed Meta’s next open-source language models. Powering it means three new gas plants and 1.5 GW of solar and storage, with Meta committing $3.2B to keep the lights on. Regulators tout it as a blueprint; critics see a future of stranded energy assets if AI efficiency races ahead. Read more.

Grok 2 goes open-source and Musk’s new ‘Macrohard’ project bets on an AI-only software stack

Elon Musk’s xAI just sketched its most aggressive play yet. The “Macrohard” project is pitched as a software company made entirely of AI agents. No humans in the loop, no hardware required. The plan is to use Grok as a controller to spawn fleets of specialized models for coding, design, speech, video, and testing, then run them inside virtual machines where simulated “AI users” stress-test the results until they’re production ready.

No human dev cycles, no traditional stack – just Colossus, xAI’s Memphis supercomputer scaling to hundreds of thousands of Nvidia GPUs, as the factory floor. Macrohard was trademarked this month with coverage ranging from text generation to game design.

At the same time, xAI has made Grok 2 fully open, releasing its weights on Hugging Face – effectively handing developers a production-grade LLM to dissect, benchmark, or fine-tune. This isn’t a free sandbox: the xAI Community License bans using Grok 2 to bootstrap competing foundation models and requires any redistribution to carry the “Powered by xAI” mark. Non-commercial tinkering is wide open, while commercial deployments must clear xAI’s terms.
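For developers who want to pull the checkpoint and start benchmarking, a minimal sketch using the huggingface_hub client is below. The repo name xai-org/grok-2 is an assumption based on xAI’s Hugging Face organization, so verify it on the actual model page, and note that the weights run to hundreds of gigabytes.

# Minimal sketch: downloading the released Grok 2 weights with huggingface_hub.
# The repo id "xai-org/grok-2" is assumed; verify it on xAI's Hugging Face page.
from huggingface_hub import snapshot_download

local_path = snapshot_download(
    repo_id="xai-org/grok-2",  # assumed repo id for the open-weights release
    local_dir="grok-2",        # destination directory for the checkpoint files
)
print(f"Checkpoint files saved under {local_path}")

Anything built on top of the download is still subject to the xAI Community License terms described above.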

Musk also confirmed Grok 2.5 is already open-sourced, with Grok 3 set to follow in six months. Strategically, this move plants xAI firmly in the open-model camp alongside Meta and Mistral, while keeping tight control over competitive spillover. Read more.

Apple bends its AI walls without breaking them

Bloomberg reports that Apple is exploring Google’s Gemini to power a rebuilt Siri, even as it tests two internal systems: Linwood, its trillion-parameter in-house LLM, and Glenwood, a hybrid setup integrating external providers. The goal is 2026, a year late after scaling stumbles and leadership churn that saw AI architect Ruoming Pang defect to Meta on a $200M deal. The Gemini talks, and parallel feelers to OpenAI and Anthropic, signal Apple’s recognition that privacy-driven, on-device AI alone won’t close its competitive gap.

That same pragmatism is shaping Apple’s enterprise push. September’s updates will let IT teams toggle which external models employees can access, starting with ChatGPT Enterprise, while deciding whether data stays in Apple’s Private Cloud Compute or moves to external clouds. The company is also shipping infrastructure upgrades, from Apple Business Manager APIs to streamlined device migrations and Vision Pro fleet support, that underscore a new strategy: Apple isn’t surrendering control, but selectively opening gates where leverage matters. Read more.

5 recent AI-powered tools from around the web

arXiv is a free online library where researchers share pre-publication papers.

Your feedback is valuable. Reply to this email and let us know how you think we could add more value to this newsletter.

Interested in reaching smart readers like you? To become an AI Breakfast sponsor, reply to this email or DM us on 𝕏!
