
OpenAI built an AI coding agent and uses it to improve the agent itself

12 December 2025 at 17:16

As AI coding tools grow more popular among software developers, their adoption has begun to touch every part of the development process, including the improvement of the AI coding tools themselves.

In interviews with Ars Technica this week, OpenAI employees revealed the extent to which the company now relies on its own AI coding agent, Codex, to build and improve the development tool. “I think the vast majority of Codex is built by Codex, so it’s almost entirely just being used to improve itself,” said Alexander Embiricos, product lead for Codex at OpenAI, in a conversation on Tuesday.

Codex, which OpenAI launched in its modern incarnation as a research preview in May 2025, operates as a cloud-based software engineering agent that can handle tasks like writing features, fixing bugs, and proposing pull requests. The tool runs in sandboxed environments linked to a user’s code repository and can execute multiple tasks in parallel. OpenAI offers Codex through ChatGPT’s web interface, a command-line interface (CLI), and IDE extensions for VS Code, Cursor, and Windsurf.

© Mininyx Doodle via Getty Images

Google tells employees it must double capacity every 6 months to meet AI demand

21 November 2025 at 16:47

While AI bubble talk fills the air these days, with fears that overinvestment has inflated a bubble that could pop at any time, something of a contradiction is brewing on the ground: Companies like Google and OpenAI can barely build infrastructure fast enough to meet their AI needs.

During an all-hands meeting earlier this month, Google’s AI infrastructure head Amin Vahdat told employees that the company must double its serving capacity every six months to meet demand for artificial intelligence services, reports CNBC. The comments offer a rare look at what Google executives are telling their own employees internally. Vahdat, a vice president at Google Cloud, presented slides showing the company needs to scale “the next 1000x in 4-5 years.” (Doubling every six months compounds to roughly ten doublings over five years, or about a 1,000-fold increase.)

While a thousandfold increase in compute capacity sounds ambitious by itself, Vahdat noted some key constraints: Google needs to be able to deliver this increase in capability, compute, and storage networking “for essentially the same cost and increasingly, the same power, the same energy level,” he told employees during the meeting. “It won’t be easy but through collaboration and co-design, we’re going to get there.”

© Google

Meta’s star AI scientist Yann LeCun plans to leave for own startup

12 November 2025 at 12:14

Meta’s chief AI scientist and Turing Award winner Yann LeCun plans to leave the company to launch his own startup focused on a different type of AI called “world models,” the Financial Times reported. The French-American scientist has reportedly told associates he will depart in the coming months and is already in early talks to raise funds for the new venture. The departure comes after CEO Mark Zuckerberg radically overhauled Meta’s AI operations, having decided the company had fallen behind rivals such as OpenAI and Google.

World models are hypothetical AI systems that some AI engineers expect to develop an internal “understanding” of the physical world by learning from video and spatial data rather than text alone. Unlike current large language models (such as those that power ChatGPT), which predict the next segment of data in a sequence, world models would ideally simulate cause-and-effect scenarios, understand physics, and enable machines to reason and plan more like animals do. LeCun has said this architecture could take a decade to fully develop.
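
To make that distinction concrete, here is a small, runnable Python sketch. It is a toy illustration only, not LeCun’s proposed architecture, and every name and number in it is invented for this example: the first half trains a next-token predictor on a sequence of symbols (the kind of objective behind LLMs), while the second half fits a crude world model that maps a (state, action) pair to the next state of a tiny simulated environment and can then roll that environment forward without touching it.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Next-token prediction (the LLM-style objective) ------------------------
# A trivial bigram "model": count how often each token follows each other
# token, then predict the most frequent successor. Real LLMs use Transformers,
# but the training signal -- predict the next symbol in a sequence -- is the same.
tokens = [0, 1, 2, 0, 1, 2, 0, 1]
counts = np.zeros((3, 3))
for a, b in zip(tokens[:-1], tokens[1:]):
    counts[a, b] += 1

def predict_next_token(token):
    return int(np.argmax(counts[token]))

print(predict_next_token(0))  # -> 1, learned purely from symbol statistics

# --- Next-state prediction (the world-model-style objective) ----------------
# A 1-D "physics" environment: a ball with position and velocity, where the
# action is a push (acceleration). The dynamics are learned from
# (state, action, next_state) examples, after which the model can roll the
# world forward "in its head" to evaluate plans without acting in it.
def true_dynamics(state, action, dt=0.1):
    pos, vel = state
    return np.array([pos + vel * dt, vel + action * dt])

X, Y = [], []
for _ in range(200):
    s = rng.normal(size=2)   # random (position, velocity)
    a = rng.normal()         # random push
    X.append([s[0], s[1], a])
    Y.append(true_dynamics(s, a))
W, *_ = np.linalg.lstsq(np.array(X), np.array(Y), rcond=None)  # fit linear dynamics

def imagine_rollout(state, actions):
    """Predict successive states instead of next tokens: simulate, then plan."""
    trajectory = [state]
    for a in actions:
        state = np.array([state[0], state[1], a]) @ W
        trajectory.append(state)
    return trajectory

print(imagine_rollout(np.array([0.0, 1.0]), [0.0, 0.0, -1.0])[-1])
```

The point of the contrast is the training target: the language-model-style objective is graded on the next symbol in a sequence, while the world-model-style objective is graded on what the environment does next, which is what would let such a system run imagined cause-and-effect rollouts for planning.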

While some AI experts believe that Transformer-based AI models (such as large language models, video synthesis models, and interactive world synthesis models) have emergently modeled physics or absorbed the structural rules of the physical world from their training data, the evidence so far generally points to sophisticated pattern-matching rather than a genuine understanding of how the physical world actually works.

© Photo by Kevin Dietsch/Getty Images
