Scientists at the University of Geneva have bridged a long-standing gap in AI by creating an artificial neural network that learns a task from instructions and then communicates it to another AI, which can replicate the task.
Humans can grasp new tasks from brief instructions and express what they have learned clearly enough for others to replicate it. This ability is essential to human communication and is a key feature of our conscious world.
This fascinating study, detailed in Nature Neuroscience, gives AI a form of human communication and learning that has long eluded technology.
The project, led by UNIGE School of Medicine professor Alexandre Pouget and his team, explores advanced techniques in natural language processing, a subset of AI focused on machines understanding and responding to human language.
Pouget explained the current limitations of AI in an article posted on the University of Geneva website: “Conversational agents using AI today can integrate linguistic information to generate text or images. But as far as we know, they are not yet able to translate verbal or written instructions into sensorimotor actions, or even explain them to other artificial intelligences so that they can reproduce them.”
The Geneva team built on S-Bert, an existing artificial neural network (ANN) for language understanding.
They connected S-Bert to a smaller, simpler network that simulates Wernicke’s and Broca’s areas, the regions of the human brain responsible for speech perception and production.
With training, this network can execute tasks based on written English instructions and then verbally communicate these tasks to a “sister” network, allowing the two AIs to exchange task instructions purely through language.
“We started with S-Bert, an existing artificial neural model with 300 million neurons, pre-trained to understand language,” explained Reidar Riveland, a Ph.D. student who participated in the study. “We ‘connected’ it to another, simpler network made up of a few thousand neurons.”
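As a rough sketch of this kind of setup (the model name, dimensions, and task format below are illustrative assumptions, not the study’s published code), a frozen pre-trained sentence encoder can supply an instruction embedding to a much smaller network that produces sensorimotor output:

```python
# Illustrative sketch only: a frozen S-Bert-style encoder feeding a small
# sensorimotor network, loosely analogous to the architecture described above.
import torch
import torch.nn as nn
from sentence_transformers import SentenceTransformer

# Pre-trained sentence encoder (hundreds of millions of parameters), kept frozen.
encoder = SentenceTransformer("all-mpnet-base-v2")  # illustrative SBERT variant

class SensorimotorNet(nn.Module):
    """Small trainable network (thousands of units) mapping an instruction
    embedding plus a visual input sequence to actions; a stand-in for the
    simulated Wernicke's/Broca's areas."""
    def __init__(self, embed_dim=768, visual_dim=32, hidden_dim=256, n_actions=8):
        super().__init__()
        self.rnn = nn.GRU(embed_dim + visual_dim, hidden_dim, batch_first=True)
        self.action_head = nn.Linear(hidden_dim, n_actions)

    def forward(self, instruction_embedding, visual_sequence):
        # Broadcast the fixed instruction embedding across all timesteps.
        steps = visual_sequence.shape[1]
        context = instruction_embedding.unsqueeze(1).expand(-1, steps, -1)
        hidden, _ = self.rnn(torch.cat([context, visual_sequence], dim=-1))
        return self.action_head(hidden)  # action logits per timestep

# Encode a written instruction and produce actions for a dummy stimulus.
instruction = "point to the stimulus with the higher contrast"
embedding = torch.tensor(encoder.encode([instruction]))      # shape (1, 768)
visual_input = torch.randn(1, 10, 32)                        # 10-step toy stimulus
action_logits = SensorimotorNet()(embedding, visual_input)   # shape (1, 10, 8)
```

In a design like this, only the small network would be trained on the tasks; the language model’s role is simply to turn the written instruction into a vector the sensorimotor network can condition on.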
Tasks ranged from simple instructions, such as pointing to a location, to more complex instructions that required identifying subtle contrasts between visual stimuli.
Key findings from the research include:
- The system can understand and execute commands, performing new, unseen tasks correctly 83% of the time based solely on verbal instructions.
- The system can generate descriptions of learned tasks in a way that allows a second AI to understand and replicate them with a similar success rate (sketched below).
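As a purely hypothetical sketch of that handoff (the class and method names here are toy stand-ins, not the study’s code), the exchange works roughly like this: one network performs an instructed task, generates a linguistic description of it, and the second network executes the task from that description alone:

```python
# Toy illustration of the language-only handoff between two networks.
# Everything here is a simplified stand-in for the trained recurrent
# networks coupled to a frozen S-Bert encoder in the actual study.

class InstructedAgent:
    """Toy agent that can perform tasks from written instructions and
    re-describe tasks it has learned."""

    def __init__(self, name: str):
        self.name = name
        self.learned: dict[str, str] = {}  # task id -> instruction text

    def learn(self, task_id: str, instruction: str) -> None:
        self.learned[task_id] = instruction

    def perform(self, instruction: str) -> str:
        # Stand-in for running the sensorimotor network on a stimulus.
        return f"{self.name} executes: {instruction!r}"

    def describe(self, task_id: str) -> str:
        # Stand-in for the production pathway that generates a
        # linguistic description of a learned task.
        return self.learned[task_id]

agent_a = InstructedAgent("A")
agent_b = InstructedAgent("B")

# Agent A learns a task from a written instruction...
agent_a.learn("contrast", "respond to the stimulus with the higher contrast")
print(agent_a.perform(agent_a.learned["contrast"]))

# ...then hands the task to agent B purely through language.
handoff = agent_a.describe("contrast")
print(agent_b.perform(handoff))
```

The point of the sketch is the interface: nothing passes between the two agents except a sentence, which is what makes the result a model of linguistic task transmission rather than weight sharing.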
This opens up new opportunities in robotics by advancing the ability of AI models to learn and communicate tasks linguistically.
The system integrates verbal comprehension with sensorimotor skills, meaning the AI can understand commands asking it to perform tasks, such as picking up an object from a shelf or moving in a certain direction, and act on them.
“The network we developed is very small,” the researchers shared about their study. “Nothing now stands in the way of building on this foundation to develop even more complex networks that would be integrated into humanoid robots capable not only of understanding us, but also of understanding each other.”
With recent significant investments in AI robotics companies like Figure AI, intelligent androids may be closer to reality than we think.