Saturday, June 15, 2013

3D printing powered by thought

Imagine if you could print objects just by thinking about them. This article investigates whether that is a far-fetched dream or a real possibility.

Image: Thinker Thing, a start-up based in Santiago, Chile, says it has developed a way of printing 3D objects from people’s thoughts. 

It’s definitely not a bird. Nor is it a plane. The garish orange piece of plastic, small enough to hold in the palm of a hand, could pass for a missing limb of a toy tyrannosaurus. It may not look all that impressive, but it’s notable for two reasons. One is that the monster arm has emerged from a 3D printer. The other is that it is, in fact, the first ever object made from thought.
This milestone was reached with little fanfare last month at the Santiago MakerSpace, a technology and design studio in the Chilean capital. The toy limb’s shape was determined according to the wishes of its designer, as gleaned from a headset picking up his brainwaves. The man in question was George Laskowsky, Chief Technical Officer of Thinker Thing, the Chilean start-up developing the mind-controlled 3D printing system.
Image: This toy arm is the first object from thought successfully created by the start-up.

Engineers and designers have been using 3D printers for more than two decades. More recently, prices have tumbled and desktop devices are increasingly being pitched at consumers. The touted possibilities appear to be endless – from bones to buildings to burritos – leading some observers to predict revolutionary consequences such as the eventual demise of the factory. Because 3D printers build objects layer by layer from materials such as plastic or metal dust, a key advantage is the comparative freedom they give designers. Yet the design software is not easy to master, especially if you are four years old and haven’t yet learnt to hold a pencil properly.
Image: The shape and form of the toy limb were determined using a headset that picked up a person’s brainwaves.

“What is the point of these printers if my son cannot design his own toy?” says Bryan Salt, CEO of Thinker Thing. “I realised that while there were a lot of people talking about the hardware of the printer no-one really seemed to be talking about how to actually use it.” In theory 3D printers could help unleash our inner creativity, freeing us from the constraints of traditional production methods. However, in practice those unwilling or unable to plough through the software instruction manual could be left downloading ready-made models designed by others.
That’s where Emotional Evolutionary Design (EED), the software that allows Thinker Thing to interpret its users’ thoughts, comes in. Its current role is to power the Monster Dreamer Project, which will allow users to design their own fantastical creatures using the power of thought. Chilean children will get the first opportunity to try it out during a tour of schools in the country at the end of this month.
Image: The idea is that even a young person would be able to create 3D objects from their imagination.

When those children sit in front of a computer running Monster Dreamer, they will be presented with a series of different body shapes in bubbles. These will mutate randomly, with built-in rules preventing them becoming too abstract. The children’s reactions to the changes will be picked up by an Emotiv EPOC headset, a $300 electroencephalography (EEG) device designed to pick up the electrical signals from brain cell interactions using fourteen sensors on the scalp. As different brain states such as excitement or boredom generate specific patterns of brain activity, the computer can identify the shapes associated with positive emotional responses. The favoured shapes will grow bigger on the screen, while the others shrink. The biggest shapes are combined to generate a body part, and the process is repeated for different body parts until the monster is complete. The final result should be a unique 3D model that is ready for printing as a solid object.
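The process described above is, in effect, an interactive evolutionary loop in which an EEG-derived engagement score replaces the mouse click as the fitness signal. Below is a minimal sketch of such a loop; the parameter-vector shape encoding, the read_engagement stub and all numeric values are illustrative assumptions, not Thinker Thing's actual implementation.

```python
import random

POPULATION = 8    # shapes shown on screen at once (assumed value)
ROUNDS = 5        # mutation rounds per body part (assumed value)

def mutate(shape, rate=0.1):
    """Randomly perturb a shape's parameters, clamped so the result
    cannot drift into something too abstract."""
    return [min(1.0, max(0.0, g + random.uniform(-rate, rate))) for g in shape]

def read_engagement(shape):
    """Stub for the EEG-derived score. In the real system this would
    come from the headset's emotional readings; here a random number
    stands in."""
    return random.random()

def evolve_body_part(n_params=6):
    shapes = [[random.random() for _ in range(n_params)]
              for _ in range(POPULATION)]
    for _ in range(ROUNDS):
        shapes = [mutate(s) for s in shapes]
        scores = [read_engagement(s) for s in shapes]
        # Favoured shapes "grow": keep the best half and refill the
        # population with their mutants.
        ranked = sorted(zip(scores, shapes), key=lambda p: p[0], reverse=True)
        survivors = [s for _, s in ranked[: POPULATION // 2]]
        shapes = survivors + [mutate(s) for s in survivors]
    # Combine the two top shapes into the finished body part by
    # averaging their parameters.
    return [(x + y) / 2 for x, y in zip(shapes[0], shapes[1])]

monster = [evolve_body_part() for _ in range(4)]  # e.g. four body parts
print(monster)
```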
Second nature
Image: An early version of the so-called Emotional Evolutionary Design system that will allow people to evolve a design with their mind.

Design steered by emotional responses is based on the notion that most people are better at critiquing a design than they are at thinking of new ideas from scratch, especially if they have no training. “One of the biggest bottlenecks right now with 3D printing is content,” says Professor Hod Lipson, director of the Creative Machines Lab at Cornell University, in Ithaca, New York State. “We have iPods with no music. We have machines that can make almost anything but we do not have a lot of things to make with them.”
Lipson’s lab is also working on evolving 3D models with the mind. EndlessForms, created by two of Lipson’s students, is a website that mimics nature’s way of creating new designs in small steps. At the start of the process users are presented with 15 three-dimensional shapes. Clicking on any two will combine them and produce 15 new shapes based on those choices. If you wanted to make a cat, you might click on one shape with the semblance of a muzzle and another with two pointed, ear-like triangles on top. The computer would then offer up a series of new shapes that more closely resemble the cat you have in mind, and so on until the model reaches the desired shape.
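In genetic-algorithm terms, EndlessForms' click-two-shapes step is crossover followed by mutation. Here is a toy sketch under the assumption that a shape is a simple vector of numeric parameters (the real site evolves far richer encoded 3D forms):

```python
import random

def breed(parent_a, parent_b, offspring=15, rate=0.05):
    """Produce the next generation from two user-selected shapes via
    uniform crossover plus small Gaussian mutations."""
    children = []
    for _ in range(offspring):
        child = [random.choice(pair) for pair in zip(parent_a, parent_b)]
        children.append([g + random.gauss(0, rate) for g in child])
    return children

# The user clicks a "muzzle-like" shape and an "eared" shape.
muzzle = [0.8, 0.1, 0.3, 0.5]
ears = [0.2, 0.9, 0.4, 0.6]
print(len(breed(muzzle, ears)), "new shapes to choose from")
```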
Image: Chilean children will get the first opportunity to try it out later this month when they will turn their thoughts to changing monster designs.

To reduce the time spent clicking, the researchers came to the same conclusion as the Thinker Thing team – feeding users’ thoughts directly back into the computer could make the process quicker. So, last year, the team used Emotiv EPOC headsets to read users’ brain signals and thereby determine their reactions. But then they ran into a problem. “At some point we were thinking it was only measuring the level of sweat because we were actually trying so hard to feel happy or sad about something,” says Lipson. No matter how many times they tried, the scientists could not find a reliable signal to use from the headset.
The problem with cheap consumer headsets is that the signals they pick up are already weak. The skull dampens the small electrical impulses from the brain’s neurons and electrical signals from nearby facial muscles can overpower them. Some sceptics argue that consumer EEG devices are not really measuring thoughts at all.
Others argue that for applications that only require basic feedback such as yes or no, the readings they generate can be accurate enough. “If they are simple positive or negative emotions, it can be 100%,” says Dr Olga Sourina, head of the Cognitive Human Computer Interaction Lab at the Nanyang Technological University, Singapore. “When you need to differ between more emotions like anger or fear, they can be less accurate.” Sourina has spent nine years working on improving the ability of computers to recognise emotions from EEG.
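In practice, telling a simple positive response from a negative one is usually framed as binary classification over features extracted from the EEG signal, such as band power per sensor. A hedged sketch with entirely synthetic data and a generic classifier, not any particular lab's method:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Illustrative training data: each row is a feature vector extracted
# from a short EEG window (e.g. band power for each of 14 sensors);
# labels mark whether the wearer reported a positive (1) or
# negative (0) reaction. All values here are synthetic.
X_train = rng.normal(size=(200, 14))
y_train = (X_train[:, 0] + 0.5 * rng.normal(size=200) > 0).astype(int)

clf = LogisticRegression().fit(X_train, y_train)

# Classify a new (synthetic) window of features.
window = rng.normal(size=(1, 14))
print("positive" if clf.predict(window)[0] == 1 else "negative")
```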
With some of the limitations of consumer EEG technology in mind, Lipson and colleagues decided to monitor thoughts via the eyeballs. Eye tracking can identify the shapes that get the most attention, and this could be used to shape design processes. The snag is that without an EEG headset it is not possible to tell whether someone is looking at a shape because they find it strange or beautiful. Even so, the team’s research so far suggests that at the end of the process participants still feel they have managed to reach the desired design. There’s some way to go, but in the future a combination of brain scanning and eye tracking may be preferred to the trusty old mouse when it comes to 3D object design.
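As a feedback signal, gaze data reduces to something very simple: total dwell time per shape, which (as the snag above notes) says where attention went but not whether the viewer liked what they saw. A minimal sketch, assuming the tracker delivers (x, y) samples at a fixed rate and each shape occupies a known screen rectangle:

```python
def dwell_times(gaze_samples, regions, sample_dt=0.016):
    """Accumulate how long the gaze rested inside each shape's
    bounding box. gaze_samples: (x, y) points taken every sample_dt
    seconds; regions maps a shape id to its (x0, y0, x1, y1)
    screen rectangle."""
    totals = {shape_id: 0.0 for shape_id in regions}
    for x, y in gaze_samples:
        for shape_id, (x0, y0, x1, y1) in regions.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[shape_id] += sample_dt
                break
    return totals

# Toy usage: two shapes on screen and a handful of gaze samples.
regions = {"A": (0, 0, 100, 100), "B": (200, 0, 300, 100)}
samples = [(50, 50), (60, 40), (250, 30), (55, 70)]
print(dwell_times(samples, regions))  # shape "A" attracts the most attention
```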

Image: The final result should be a unique 3D model that is ready for printing as a solid object.
Dream maker
If and when that day comes, it still won’t be a case of closing your eyes, imagining a unicorn and hearing the printer take off. But recent work by two teams of neuroscientists suggests the idea of translating whole images from the mind to the design screen may not be as improbable as it sounds. “In principle, it is not farfetched at all,” says Professor Jack Gallant, from the University of California, Berkeley. “We have already published many papers where we reconstruct photographs or movies from what people have seen.”
In 2011, Gallant’s team used a magnetic resonance imaging (MRI) scanner to show how computers can be trained to read the minds of those watching moving images. They built up a database of the activity in a key visual centre of the brain as three fellow researchers watched a compilation of Hollywood film trailer clips. Next the subjects watched a new set of clips. Based on the brain activity this generated, and the database created from the first phase of the experiment, the computer was asked to select, from 5,000 hours of randomly chosen YouTube clips, the segments that best matched the second sequence of clips. The results, although blurry, were recognisable as copies of the originals, and demonstrated for the first time the ability to decode moving images from the brain.
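Conceptually, the reconstruction step is a similarity search: record or model the brain response for every clip in a large library, then, for a newly measured response, retrieve the library clips whose responses match best and blend them. A heavily simplified sketch with random stand-in data; the actual study fitted encoding models of individual visual-cortex voxels, which this toy version omits:

```python
import numpy as np

rng = np.random.default_rng(1)
N_CLIPS, N_VOXELS = 5000, 300   # stand-in sizes, not the study's

# Database built in phase one: the brain response evoked (or
# predicted) for every clip in the library. Synthetic here.
library = rng.normal(size=(N_CLIPS, N_VOXELS))

def best_matches(response, k=10):
    """Indices of the k library clips whose responses correlate best
    with a newly measured response; blending those clips' frames
    yields the blurry reconstructions described in the text."""
    lib = library - library.mean(axis=1, keepdims=True)
    tgt = response - response.mean()
    scores = lib @ tgt / (np.linalg.norm(lib, axis=1)
                          * np.linalg.norm(tgt) + 1e-9)
    return np.argsort(scores)[-k:][::-1]

measured = rng.normal(size=N_VOXELS)   # synthetic response to a new clip
print(best_matches(measured))
```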
A related advance came in April this year when Japanese scientists led by Yukiyasu Kamitani, of the ATR Computational Neuroscience Laboratories in Kyoto, revealed they had made significant steps towards automated dream decoding. Three people had their brains scanned in an MRI machine while they slept. They were awoken when EEG signals indicated they had reached an early phase of sleep associated with dreaming. The researchers then asked them to describe their dreams, with the process being repeated until more than 200 reports had been collected for each person.
Kamitani’s group then chose 20 categories of objects and scenes based on the words that occurred most frequently in the descriptions. They selected photos representing each category, and scanned their volunteers’ brains while they looked at them. Comparing the scans taken while participants were awake with those taken while they were dreaming allowed dream content to be predicted accurately 60-70% of the time, depending on the individual, the brain areas analysed and the objects and scenes involved. It may not yet be a fully formed “dream decoder”, but it does show that directly decoding mental images is a possibility.
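The decoding scheme amounts to training a multi-class classifier on scans taken while subjects viewed labelled photos, then applying it to scans captured just before waking. A toy sketch with synthetic features standing in for the fMRI data; the classifier choice and the data sizes here are assumptions for illustration:

```python
import numpy as np
from sklearn.neighbors import NearestCentroid

rng = np.random.default_rng(2)
N_CATEGORIES, N_FEATURES = 20, 100   # 20 categories, per the article

# Waking phase: scans recorded while each volunteer viewed photos of
# the chosen categories (synthetic stand-ins here).
X_awake = rng.normal(size=(400, N_FEATURES))
y_awake = rng.integers(0, N_CATEGORIES, size=400)

clf = NearestCentroid().fit(X_awake, y_awake)

# Sleeping phase: a scan taken just before waking the subject; the
# prediction is then compared against the verbal dream report.
dream_scan = rng.normal(size=(1, N_FEATURES))
print("predicted dream category:", clf.predict(dream_scan)[0])
```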
MRI scanners may be better at reading the brain than cheap EEG headsets, but that does not make them a practical or affordable solution for 3D printing. “As a giant three million dollar magnet, it is not something you would just wear around,” says Gallant.
Even if these issues can be overcome, there are other obstacles. Seeing an object and imagining one may not produce the same brain signals. On top of this, individuals vary widely in their ability to dream up designs from scratch, and in the level of detail they can imagine. “I have been doing 3D modelling since it began back in the 80s,” says Salt. “And the process is that you build something and then you move it about. You do not sit down and think, I have something absolutely finite in my head and that is what I am going to build.”
These challenges suggest the idea of 3D printing guided by users’ fully formed mental images is, if not entirely far-fetched, a long way from becoming reality. Combining sensors that can pick up human emotions with design software that can interpret and respond to them looks like the closest we are going to get to creating 3D objects from thought in the near future.
So Thinker Thing’s twig-like orange monster arm, as unsophisticated as it may appear at first glance, may one day be celebrated as marking the start of a new and exciting way of moulding the things around us. “It is really something magical to be there, sat without moving a limb, and watching the designs evolve into something that you were thinking about,” says Laskowsky.
Source: http://www.bbc.com
