AI may not need massive training data after all

By admin · January 5, 2026 · 3 Mins Read

New research from Johns Hopkins University shows that artificial intelligence systems built with designs inspired by biology can begin to resemble human brain activity even before they are trained on any data. The study suggests that how AI is structured may be just as important as how much data it processes.

The findings, published in Nature Machine Intelligence, challenge the dominant strategy in AI development. Instead of relying on months of training, enormous datasets, and vast computing power, the research highlights the value of starting with a brain-like architectural foundation.

Rethinking the Data-Heavy Approach to AI

“The way that the AI field is moving right now is to throw a bunch of data at the models and build compute resources the size of small cities. That requires spending hundreds of billions of dollars. Meanwhile, humans learn to see using very little data,” said lead author Mick Bonner, assistant professor of cognitive science at Johns Hopkins University. “Evolution may have converged on this design for a good reason. Our work suggests that architectural designs that are more brain-like put the AI systems in a very advantageous starting point.”

Bonner and his colleagues aimed to test whether architecture alone could give AI systems a more human-like starting point, without relying on large-scale training.

Comparing Popular AI Architectures

The research team focused on three major types of neural network designs commonly used in modern AI systems: transformers, fully connected networks, and convolutional neural networks.

They repeatedly adjusted these designs to create dozens of different artificial neural networks. None of the models were trained beforehand. The researchers then showed the untrained systems images of objects, people, and animals and compared their internal activity to brain responses from humans and non-human primates viewing the same images.
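The article doesn't reproduce the paper's exact analysis pipeline, but comparisons of this kind are typically made with representational similarity analysis (RSA): build a dissimilarity matrix over stimuli for each system, then correlate the matrices. The sketch below illustrates that idea with synthetic arrays standing in for an untrained network's activations and for recorded brain responses; all names and array sizes are illustrative assumptions, not the study's data.

```python
import numpy as np

def rdm(responses: np.ndarray) -> np.ndarray:
    """Representational dissimilarity matrix: 1 minus the Pearson
    correlation between response patterns for every pair of stimuli.
    responses: shape (n_stimuli, n_units)."""
    return 1.0 - np.corrcoef(responses)

def rsa_score(model_responses: np.ndarray, brain_responses: np.ndarray) -> float:
    """Correlate the upper triangles of the two RDMs. A higher score
    means the model's representational geometry is more brain-like."""
    m, b = rdm(model_responses), rdm(brain_responses)
    iu = np.triu_indices_from(m, k=1)  # off-diagonal pairs only
    return float(np.corrcoef(m[iu], b[iu])[0, 1])

rng = np.random.default_rng(0)
n_stimuli = 50
# Stand-ins for real data: one layer of an *untrained* network and
# brain recordings, both responding to the same 50 images.
untrained_net = rng.standard_normal((n_stimuli, 512))  # 512 artificial units
brain = rng.standard_normal((n_stimuli, 200))          # 200 recording sites

score = rsa_score(untrained_net, brain)
print(f"brain-similarity score: {score:.3f}")
```

With real data, the untrained-network array would come from feeding the image set through a randomly initialized model; the study's finding is that for convolutional architectures this score rises toward trained-model levels as the number of artificial neurons grows.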

Why Convolutional Networks Stood Out

Increasing the number of artificial neurons in transformers and fully connected networks produced little meaningful change. However, similar adjustments to convolutional neural networks led to activity patterns that more closely matched those seen in the human brain.

According to the researchers, these untrained convolutional models performed on par with traditional AI systems that typically require exposure to millions or even billions of images. The results suggest that architecture plays a larger role in shaping brain-like behavior than previously believed.

A Faster Path to Smarter AI

“If training on massive data is really the crucial factor, then there should be no way of getting to brain-like AI systems through architectural modifications alone,” Bonner said. “This means that by starting with the right blueprint, and perhaps incorporating other insights from biology, we may be able to dramatically accelerate learning in AI systems.”

The team is now exploring simple learning methods inspired by biology that could lead to a new generation of deep learning frameworks, potentially making AI systems faster, more efficient, and less dependent on massive datasets.
