Scientists create creepy lifelike faces with real human skin for robots

Future of robotics could include living skin for humanoid machines

By Kurt Knutsson, CyberGuy Report, Fox News

Researchers from the University of Tokyo have developed a groundbreaking method to cover robotic surfaces with genuine, living skin tissue. The idea of robots with skin isn’t just about creating a more lifelike appearance. This innovation opens up a world of possibilities, from more realistic prosthetics to robots that can seamlessly blend into human spaces. 

As we delve into the details of this research, we’ll uncover how these scientists are bridging the gap between artificial and biological systems, potentially revolutionizing fields ranging from health care to human-robot interaction.

Engineered skin tissue (Shoji Takeuchi’s research group at the University of Tokyo) (Kurt “CyberGuy” Knutsson)

What’s the big deal?

We’re talking about robots that not only look human-like but also have skin that can heal, sweat and even tan. This isn’t just about aesthetics; it’s about creating robots that can interact more naturally with humans and their environment.

Illustration of the tissue-fixation method (Shoji Takeuchi’s research group at the University of Tokyo) (Kurt “CyberGuy” Knutsson)

How does it work?

The secret lies in something called “perforation-type anchors.” These clever little structures are inspired by the way our own skin attaches to the tissues underneath. Essentially, they allow living tissue to grow into and around the robot’s surface, creating a secure bond.

The researchers used a combination of human dermal fibroblasts and human epidermal keratinocytes to create this living skin. They cultured these cells in a carefully prepared mixture of collagen and growth media, allowing the tissue to mature and form a structure similar to human skin.

Evaluation of the perforation-type anchors to hold tissue (Shoji Takeuchi’s research group at the University of Tokyo) (Kurt “CyberGuy” Knutsson)

The minds behind the innovation

This groundbreaking research was conducted at the Biohybrid Systems Laboratory at the University of Tokyo, led by Professor Shoji Takeuchi. The team’s work is pushing the boundaries of what’s possible in robotics and bioengineering.

Demonstration of the perforation-type anchors to cover the facial device (Shoji Takeuchi’s research group at the University of Tokyo) (Kurt “CyberGuy” Knutsson)

Building a face that can smile

One of the coolest demonstrations of this technology is a robotic face covered with living tissue that can actually smile. The researchers created a system where the skin-covered surface can be moved to mimic facial expressions.

To achieve this, they designed a robotic face with multiple parts, including a base with perforation-type anchors for both a silicone layer and the dermis equivalent. This silicone layer mimics subcutaneous tissue, contributing to a more realistic smiling expression.

The smiling robotic face (Shoji Takeuchi’s research group at the University of Tokyo) (Kurt “CyberGuy” Knutsson)

Challenges and solutions

Getting living tissue to stick to a robot isn’t as easy as it sounds. The team had to overcome issues like making sure the tissue could grow into the anchor points properly. They even used plasma treatment to make the surface more “tissue-friendly.”

The researchers also had to consider the size and arrangement of the anchors. Through finite element method simulations, they found that larger anchors provided more tensile strength, but there was a trade-off with the area they occupied.
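The trade-off the simulations surfaced can be illustrated with a toy model. All dimensions and strength constants below are invented for illustration; this is not the researchers' finite element setup.

```python
import math

# Toy illustration of the anchor-size trade-off: a bigger anchor holds
# the tissue more strongly, but also consumes more of the surface.
# The strength constant and diameters here are made up for illustration.

def pullout_strength(diameter_mm, strength_per_mm2=0.5):
    """Hypothetical tensile strength (N) of a single anchor,
    assumed proportional to its cross-sectional area."""
    return strength_per_mm2 * math.pi * (diameter_mm / 2) ** 2

def area_occupied(diameter_mm, n_anchors):
    """Total surface area (mm^2) consumed by the anchors."""
    return n_anchors * math.pi * (diameter_mm / 2) ** 2

# Doubling the diameter quadruples the strength per anchor,
# but also quadruples the surface area the anchors take up.
for d in (0.5, 1.0, 2.0):
    print(f"diameter {d} mm: {pullout_strength(d):.2f} N per anchor, "
          f"{area_occupied(d, 100):.1f} mm^2 for 100 anchors")
```

Under this assumption there is no free lunch: strength and occupied area scale together, so the designers must pick an anchor size that holds well enough while leaving most of the surface free for living tissue.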

Engineered skin tissue (Shoji Takeuchi’s research group at the University of Tokyo) (Kurt “CyberGuy” Knutsson)

Why this matters

This technology could be a game-changer for fields like prosthetics and humanoid robotics. Imagine prosthetic limbs that look and feel just like real skin or robots that can interact with humans in more natural ways.

The ability to create skin that can move and express emotions opens up new possibilities for human-robot interaction. It could lead to more empathetic and relatable robotic assistants in various fields, from health care to customer service.

The smiling robotic face (Shoji Takeuchi’s research group at the University of Tokyo) (Kurt “CyberGuy” Knutsson)

While we’re still a long way from seeing robots with fully functional living skin walking among us, this research from the University of Tokyo opens up exciting possibilities. It’s a step towards creating robots that blur the line between machines and living organisms.

As we continue to advance in this field, we’ll need to grapple with the technical challenges and ethical implications of creating increasingly lifelike machines. Future research might focus on improving the durability of living skin, enhancing its ability to heal or even incorporating sensory capabilities. One thing’s for sure: The future of robotics is looking more human than ever.


After Watson, IBM Looks to Build ‘Brain in a Box’

By Jennifer Booton

Your World Tomorrow

Published August 22, 2013

FOXBusiness
IBM Watson Supercomputer (Reuters)

Imagine Watson with reason and better communication skills.

The Watson supercomputer may be able to beat reigning Jeopardy champions, but scientists at IBM (IBM) are developing new, super-smart computer chips modeled on the human brain, and those chips might ultimately prove much more impressive.

These new silicon “neurosynaptic chips,” which run on roughly the same amount of energy it takes to power a light bulb, will fuel a software ecosystem that researchers hope will one day enable a new generation of apps that mimic the human brain’s abilities of sensory perception, action and cognition.

It’s akin to giving sensors like microphones and speakers brains of their own, allowing them to consume data to be processed through trillions of synapses and neurons in a way that allows them to draw intelligent conclusions.

IBM’s ultimate goal is to build a chip ecosystem with ten billion neurons and a hundred trillion synapses, while consuming just a kilowatt of power and occupying less than a two-liter soda bottle.
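The neurons these chips implement can be sketched with the textbook leaky integrate-and-fire model: a unit accumulates incoming current, slowly leaks charge, and emits a spike when it crosses a threshold. This is a standard teaching model, not IBM's actual silicon circuit.

```python
# Minimal leaky integrate-and-fire neuron, the standard textbook model
# of the spiking units that neurosynaptic chips implement in silicon.
# (A conceptual sketch, not IBM's circuit design.)

def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Return the spike train produced by a stream of input currents."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current  # leak, then integrate
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0                     # reset after firing
        else:
            spikes.append(0)
    return spikes

print(lif_neuron([0.4, 0.4, 0.4, 0.0, 0.9, 0.3]))  # → [0, 0, 1, 0, 0, 1]
```

Note how the neuron fires only after enough input accumulates: it is the pattern of inputs over time, not any single value, that triggers a spike, which is why these chips suit pattern-recognition tasks.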

“We want to create a brain in a box.”

– IBM’s Dharmendra Modha 

“We are fundamentally expanding the boundary of what computers can do,” said Dharmendra Modha, principal investigator of IBM’s SyNAPSE cognitive computing project. “This could have far reaching impacts on technology, business, government and society.”

The researchers envision a wave of new, innovative “smart” products derived from these chips that would alter the way humans live in virtually all walks of life, including commerce, logistics, location, society, even the environment.

“Modern computing systems were designed decades ago for sequential processing according to a pre-defined program,” IBM said in a release. “In contrast, the brain—which operates comparatively slowly and at low precision—excels at tasks such as recognizing, interpreting, and acting upon patterns.”

These chips would usher in a whole new “cognitive type of processing,” said Bill Risk, who works on the IBM Research SyNAPSE Project, marking one of the most dramatic changes to computing since the traditional von Neumann architecture, built on sequential processing of ones and zeros, was adopted in the mid-1940s.

“These operations result in actions rather than just stored information, and that’s a whole different world,” said Roger Kay, president of Endpoint Technologies Associates, who has written about the research. “It really allows for a human-like assessment of problems.”

It is quite a complex system, and it is still in the early stages of development. But IBM researchers have rapidly completed the first three phases of what will likely be a multi-stage project, collaborating with a number of academic partners and collecting some $53 million in funding. They are hopeful the pace of advancement will continue.

Modha cautioned, however, that this new type of computing wouldn’t serve as a replacement for today’s computers but as a complementary sibling, with traditional digital architecture serving as the left brain, with its speed and analytic ability, and the next era of computing acting as the right brain, operating much more slowly but more cognitively.

“Together, they help to complete the computing technology we have,” Modha said.

Providing a real-life example of how their partnership might one day work, Kay imagined a medical professional performing triage on a patient.

Digital computers would provide basic functions such as the patient’s vitals, while the cognitive computer would cross-reference data collected at the scene in real time with information stored on the digital computer to assess the situation and provide relevant treatment recommendations.

“It could be a drug overdose or an arterial blockage, a human might not know which is which [from the naked eye],” explains Kay. “But a [cognitive] computer could read the symptoms, reference literature, then vote using a confidence level that can kind of infer which one is more likely the case.”
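Kay's "vote using a confidence level" idea can be sketched as simple confidence-weighted voting: each evidence source scores the candidate diagnoses, and the system picks whichever diagnosis accumulates the most confidence. The evidence sources, diagnoses and scores below are all invented for illustration.

```python
# Hypothetical sketch of confidence-level voting across evidence sources.
# Every name and number here is made up; this only illustrates the idea
# Kay describes, not any real medical or IBM system.

def diagnose(evidence_scores):
    """Sum per-source confidence for each diagnosis; return the winner."""
    combined = {}
    for source, scores in evidence_scores.items():
        for diagnosis, confidence in scores.items():
            combined[diagnosis] = combined.get(diagnosis, 0.0) + confidence
    best = max(combined, key=combined.get)
    return best, combined[best]

evidence = {
    "vital_signs":    {"drug overdose": 0.6, "arterial blockage": 0.4},
    "pupil_response": {"drug overdose": 0.8, "arterial blockage": 0.1},
    "ecg_pattern":    {"drug overdose": 0.2, "arterial blockage": 0.7},
}

diagnosis, score = diagnose(evidence)
print(diagnosis)  # → drug overdose
```

No single source settles the question; the conclusion emerges from combining weak signals, which is the "inference" behavior Kay contrasts with ordinary stored-program computing.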

Endless Possibilities Seen

The IBM researchers have put together reusable building blocks to make cognitive applications easier to build and to create an ecosystem for developers. These blocks come in the form of “corelets” that each serve a particular function, such as the ability to perceive sound or colors.

So far they have developed 150 corelets, with the intention of eventually allowing third parties to submit more after rigorous testing. Eventually, corelets could be used to build “real-life cognitive systems,” researchers hope.
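The composition model behind corelets can be illustrated in plain Python. IBM's real corelets run on neurosynaptic hardware; this analogy only shows how small single-purpose blocks chain together into a larger perceptual system, with all names and signals invented for illustration.

```python
# Conceptual analogy for corelets: small reusable blocks, each handling
# one perceptual function, composed into a pipeline. (Illustrative only;
# not IBM's corelet framework or its programming language.)

class Corelet:
    def __init__(self, name, fn):
        self.name = name
        self.fn = fn

    def __call__(self, signal):
        return self.fn(signal)

def compose(*corelets):
    """Chain corelets so each one's output feeds the next."""
    def pipeline(signal):
        for corelet in corelets:
            signal = corelet(signal)
        return signal
    return pipeline

# Two toy corelets: one normalizes raw audio samples, one flags loud ones.
normalize = Corelet("normalize", lambda xs: [x / max(map(abs, xs)) for x in xs])
detect = Corelet("detect_loud", lambda xs: [x > 0.5 for x in xs])

hear = compose(normalize, detect)
print(hear([0.2, 1.0, 4.0, -1.0]))  # → [False, False, True, False]
```

The appeal of the design is that a third party could contribute one tested block, say a color perceiver, and developers could drop it into pipelines without understanding its internals.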

To help get the ball rolling, the researchers envisioned a slew of product ideas that would make perfect use of these genius chips in real-world functions.

Here are just a few:

-An autonomous robot dubbed “Tumbleweed” could be deployed for search and rescue missions in emergency situations. Researchers picture the sphere-shaped device, outfitted with “multi-modal sensing” via 32 mini cameras and speakers, surveying a disaster and identifying people in need. It might be able to communicate with them, letting them know help is on its way or directing them to safety.

-For personal use, low-power, lightweight glasses could be designed for the nearly blind. Using these chips to recognize and analyze objects through cameras, the glasses could plot a route through a crowded room full of obstacles, directing the visually impaired wearer through speakers.

-Putting these chips to use in a business function, the researchers foresee a product they’ve dubbed the “conversation flower” that could process audio and video feeds on conference calls to identify specific people by their voice and appearance while automatically transcribing the conversation.

-Giving a glimpse into its potential use in the medical world, a thermometer could be developed that not only measures temperature but is also outfitted with a sensor able to detect certain bacteria by their unique odor, giving an alert if medical attention is needed.

-In an environmental function, researchers could see this technology being outfitted on sensor buoys, monitoring shipping lanes for safety and environmental protection.

Given the fluid nature of the project, it’s unclear how long it will take for the first generation of cognitive computers to find real-world applications, but Modha and his team are optimistic they will arrive sooner rather than later.

“We need cognitive systems that understand the environment, can deal with ambiguity and can act in a real-time, real-life context,” Modha said. “We want to create a brain in a box.”

Read more: http://www.foxbusiness.com/technology/2013/08/22/after-watson-ibm-looks-to-build-brain-in-box/
