Jua raises $16M to build a foundational AI model for the natural world, starting with the weather

© 2024 TechCrunch. All rights reserved. For personal use only.

Large AI models — the big troves of language, vision and audio data that power generative artificial intelligence services — are shaping up to be as significant in the development of AI as operating systems have been in the development of smartphones: they are, in a way, looking like the platforms of the space (an idea others are noodling on, too). Now, a Swiss startup called Jua is applying that paradigm with ambitions to open a new frontier for how AI might be used in the physical world. It has picked up $16 million to build what is essentially a large “physics” model for the natural world.

The company is still very early stage. Its first application will be in modelling and predicting weather and climate patterns, initially in how they relate to players in the energy industry. This is due to launch by the beginning of next year, the company said. Other industries that it plans to target with its model include agriculture, insurance, transportation and government.

468 Capital and the Green Generation Fund are co-leading this seed round for the Zurich-based startup, with Promus Ventures, Kadmos Capital, Flix Mobility founders, Session.vc, Virtus Resources Partners, Notion.vc and InnoSuisse also participating.

Andreas Brenner, Jua’s CEO, who co-founded the company with CTO Marvin Gabler, says that the increasing “volatility” of climate change and geopolitics has created a need among organizations that work in the physical world — whether in industrial areas like energy or agriculture or something else — for more accurate modelling and forecasting. 2023 was a record year for climate disasters, according to the U.S. National Centers for Environmental Information, resulting in tens of billions of dollars in damage: it’s this state of affairs that is driving organizations to put better planning tools in place, not to mention better predictive tools for market analysts and others that use that data.

This is, in a way, not a new problem — nor even a problem that technologists have not already been addressing with AI.

Google’s DeepMind division has built GraphCast; Nvidia has FourCastNet; Huawei has Pangu, which last year launched a weather component that drew a flurry of interest. There are also projects underway building AI models out of weather data to home in on other natural occurrences, as highlighted just last week in this report about a team trying to bring new understanding to bird migration patterns.

Jua’s response to that is twofold. First, it believes that its model is better than these others, in part because it ingests more information and is larger — 20 times larger than GraphCast, it claims. Second, weather is just the starting point for tackling a wider set of physical questions, answers and challenges.

“Businesses must improve their capabilities to respond to all this [climate] volatility,” Brenner said. “So in the short term, that is the problem we are solving. But looking into the future, we are building the first foundational model for the natural world… We’re essentially building a machine model that is learning physics… and that is one of the key pillars for achieving artificial general intelligence, because just understanding language isn’t enough.”

The company has yet to launch its first products, but the leap of faith that investors are taking is not just couched in hype for all things AI.

Before Jua, Gabler headed up research at Q.met, a longtime player in weather forecasting, and also worked on deep learning technology for the German government. Brenner has worked in the energy sector and previously founded a fleet management software startup. Taken together, those experiences bring not just technical awareness of the problems and potential solutions, but also first-hand understanding of how industry experiences them.

It’s also showing some early work to investors and would-be customers, getting their input on data, as it continues to develop the product.

One aim seems to be to take a new approach to what goes into predictive models. When building a weather-prediction model, for example, Brenner said that “using weather stations is pretty obvious.” But in addition to that, Jua is ingesting what he describes as “much more noisy data,” including recent satellite imagery, topography and other “more novel, recent data,” to build its models. “The key difference is we are building this end-to-end system where all of the data that used to be used in different steps of the value chain is now all brought into the same pool,” he explained. The company said it has around 5 petabytes (5,000 terabytes) of training data, versus some 45 terabytes for GPT-3 and (reportedly) 1 petabyte for GPT-4. (A language model may well need less training data than a physical-world model, though.)

Another aim — not a small one — is that the company is trying to build something more efficient to bring down operational costs for itself and for customers. “Our system uses 10,000 times less compute than the legacy systems,” Brenner said.

It’s notable that Jua is emerging and getting funding at this moment in particular.

Foundational models are shaping up to be the cornerstone of how the next generation of AI applications are being developed, so the companies that are building and controlling foundational models hold a lot of value and potential power.

The biggest movers and shakers in this area right now are companies like OpenAI, Google, Microsoft, Anthropic, Amazon and Meta: all U.S. businesses. That has spurred activity in other parts of the world, such as Europe, to seek out and fund homegrown champions as alternatives. Notably, 468 Capital also backs Germany’s Aleph Alpha, which — like the foundational model players in the U.S. — is building large language models, but seemingly in closer collaboration with potential customers. (One of its taglines is “Sovereignty in the AI era.”)

“Andreas, Marvin and the team are building the world’s first foundation AI for physics and the natural world, which will be capable of providing powerful insights for a wide range of industries dependent on true understanding of nature, from insurance companies and chemical and energy providers, to disaster planning teams, organisations in agriculture, airlines and aid charities,” said Ludwig Ensthaler, a general partner at the firm, in a statement.

There is a definite “good guy” feel about an AI company that is setting out to make better sense of how climate change is impacting us, to aid in better disaster planning, and perhaps even, one day, to help understand how to mitigate environmental damage. And the bigger picture for a startup aiming to build an AI that can understand the physical world is that it could potentially be applied to a much wider set of challenges in materials science, biomedicine, chemistry and much more. Beyond the feasibility of the model itself, though, the prospect carries many of the questions facing other kinds of AI models — around safety, reliability and more — something Jua is already thinking about, even if in rudimentary terms for now.

“In order for models to work and to be accepted, you need to enforce consistency,” said Gabler. “You need to make sure the models actually learn physics from the ground up to solve problems correctly.”
