May 12, 2011

Workplace Robots Need a Better View

Robotics pioneer Rodney Brooks says a new generation of industrial robots could be enabled by better machine vision.
A coming wave of industrial robots will be smart enough to work safely alongside humans in many different settings, says Rodney Brooks, a professor emeritus of robotics at MIT and a founder of iRobot.
Industrial robots have evolved little since the first ones appeared in General Motors factories about 50 years ago, Brooks says. Most workplace robots—such as those used in car manufacturing—are designed to perform simple, repetitive tasks. And they lack the sensory smarts to work safely alongside humans.
"I think there's room for a real revolution by putting sensors and computation into industrial robots," says Brooks. "What if the robots were smarter and they could go into smaller companies and be easier for ordinary people to use?"
If manufacturing robots could recognize their human coworkers and interact with them safely, Brooks says, they could be used in many more manufacturing environments, assisting with repetitive and physically demanding manual tasks.
In 2008, Brooks founded a new company, called Heartland Robotics, to develop robots for manufacturing. The company has said that its robots will be intelligent, adaptable, and inexpensive. But the company is still in stealth mode, and hasn't revealed what technologies these robots will use.
In the last few years, robotics researchers have made progress in machine vision, due in part to the falling cost of computing power and to the wealth of photos and images that can be pulled from the Web and used to train computer vision systems to recognize different objects. However, Brooks says, giving machines more human-like vision remains one of the biggest challenges to the development of more practical robots.
"Perception is really, really hard. For robots, I think it's largely unsolved," says Brooks. "Image-based recognition has worked surprisingly well, [but] it can't do the recognition that a three-year-old child can do."
Commercial machine vision systems are still usually focused on a narrow task. For example, some cars now come equipped with a system that can identify pedestrians and other vehicles, even in a cluttered scene. The system, developed by Mobileye, based in Israel, is connected to an onboard computer that applies the brakes if a collision seems imminent.
"This is the first wide-scale, highly demanding use of computer vision," says Amnon Shashua, the Sachs professor of computer science at the Hebrew University of Jerusalem, and a cofounder of Mobileye.
Shashua says the company's computer vision system works well because it only has to identify a handful of objects. But he hopes that within the next five years, the system will be able to reliably recognize almost everything within a scene. "There are at least 1,000 object classes you need to know in an image to at least do semi-autonomous driving," including signs, lights, guard rails, poles, bridges, exits, and more, he said during a symposium on artificial intelligence at MIT last week.
Mobileye is developing specialized hardware to support the specific demands of rapid image recognition. "There's still a long way to go to build hardware that is efficient, low cost, low power, that can do very complex computer vision," Shashua adds.
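As a concrete, if much simpler, illustration of the narrow-task detection described above, the sketch below runs OpenCV's stock HOG pedestrian detector over a video frame by frame. It is not Mobileye's proprietary pipeline, and the "dashcam.mp4" input is a placeholder.

```python
# Frame-by-frame pedestrian detection with OpenCV's built-in HOG person detector.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

video = cv2.VideoCapture("dashcam.mp4")   # placeholder input file
while True:
    ok, frame = video.read()
    if not ok:
        break
    # Each detection is a bounding box; a production system would feed these
    # into tracking and time-to-collision logic before touching the brakes.
    boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8))
    for (x, y, w, h) in boxes:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("detections", frame)
    if cv2.waitKey(1) == 27:              # Esc to quit
        break
video.release()
cv2.destroyAllWindows()
```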
Better machine vision systems might lead to significant advances in robotics. "How we deploy our robots is limited by what we can do with perception, so improvements in perception will lead them to be smarter and have modicums of common sense," says Brooks.

Apple and Google Defend Their Handling of Data

Congressional testimony centers on what "location" really means, and who's responsible for how apps behave.
Apple and Google are scrambling to regain trust after revelations about the way smart phones and tablets handle users' location data. In a U.S. Senate subcommittee hearing held today, representatives from Apple and Google stressed that their companies had streamlined and clarified their handling of location-based data. But a key unanswered question is how they'll let third-party app providers share that information.
The problem is that users enjoy location-based services, but most don't understand what happens to the data they share in exchange for using those services. Senators wondered if location data was being stored securely enough to protect users. They pointed to the lack of privacy policies for many mobile apps, and noted that even when users are aware of what happens to their data, they may find it difficult to control.
For example, until Apple's update to iOS last week, someone who opted out of location services wasn't actually turning off all of his device's location-based sharing. Apple said the problem was due to a bug that has since been corrected.
Guy "Bud" Tribble, vice president of software technology for Apple, testified that "Apple does not track customers' locations. Apple has never done so and has no plans to do so."
Tribble said that the location information found on phones represented a portion of a crowd-sourced database that Apple maintains in order to process location information more rapidly than is possible through GPS alone. The company stores the locations of cell towers and Wi-Fi hot spots collected from millions of devices. User devices note which towers and hot spots they can connect to, and use that to quickly deduce location. He said that the information stored on iPhones was never a user's location.
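A toy sketch of the lookup Tribble describes: the device keeps a local table of hot-spot coordinates and estimates its own position from whichever ones its radio can currently hear, without waiting for a GPS fix. The table contents, the signal-strength weighting, and the data format below are illustrative assumptions, not Apple's actual database.

```python
# Toy Wi-Fi-based positioning from a local table of known access points.
# Known access points: BSSID -> (latitude, longitude); values are invented.
HOTSPOT_DB = {
    "aa:bb:cc:00:00:01": (37.3317, -122.0307),
    "aa:bb:cc:00:00:02": (37.3321, -122.0312),
    "aa:bb:cc:00:00:03": (37.3314, -122.0299),
}

def estimate_position(visible):
    """visible: list of (bssid, rssi_dbm) pairs the radio can currently hear."""
    weighted, total = [0.0, 0.0], 0.0
    for bssid, rssi in visible:
        if bssid not in HOTSPOT_DB:
            continue
        lat, lon = HOTSPOT_DB[bssid]
        w = 1.0 / max(1.0, -rssi)     # stronger signal (less negative dBm) -> larger weight
        weighted[0] += w * lat
        weighted[1] += w * lon
        total += w
    if total == 0:
        return None                   # nothing recognized; fall back to GPS
    return weighted[0] / total, weighted[1] / total

print(estimate_position([("aa:bb:cc:00:00:01", -47), ("aa:bb:cc:00:00:03", -71)]))
```

The cached table, not any log of where the phone has been, is what makes the fix fast; that distinction is the core of Apple's defense.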

Case Study: Saving Money by Analyzing Trends

A company that runs hospitals and clinics puts its people to work more effectively after software points out patterns in a huge pool of data.
Like any other business today, ThedaCare, a Wisconsin-based health-care company with five hospitals and more than 20 primary-care clinics, is swimming in data. Nuggets about patients' demographic features, diseases, and treatments are buried in electronic medical records. Payroll tracks the hours employees spend working overtime or on call. The billing system knows how many patients have private insurance and how many rely on government programs like Medicare and Medicaid, which reimburse ThedaCare at a lower rate.
Even from these few examples, you can start to see how it would be possible for ThedaCare to spot trends and deal with them in a more cost-effective way—if those tidbits of information weren't trapped in isolated programs and databases.
Brian Veara, ThedaCare's manager of decision resources, worked for retail and manufacturing companies before he moved into health care more than eight years ago. Health care lags about 10 years behind other industries when it comes to information technology, he says, so he wasn't surprised by how long it took the organization to produce reports. But he knew something needed to change.
A major part of the solution: "business intelligence" software. That's a type of program that pulls together information from disparate systems, builds relationships between data sets, and lets users explore the information by searching and clicking around through tables, charts, and graphs.
In June 2009, Veara bought a business intelligence program called QlikView from Qlik Technologies. The company has been around since 1993, but until about five years ago, computers were too sluggish to make its software model feasible or affordable for most customers. QlikView's "in-memory" approach means it stores a copy of all the aggregated data in a computer's local memory instead of going back to retrieve information from the disparate systems as needed, so customers can work with it in real time. Among QlikTech's other customers are fire departments, newspapers, consumer products companies, and insurance providers. Individuals can download a free version of the software; some people connect it to their iTunes music library to analyze their listening habits.
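To illustrate the "in-memory" idea in miniature, the sketch below loads exports from three hypothetical source systems into RAM with pandas, links them on shared keys, and answers questions without going back to the originals. The file names and columns are invented for illustration, not ThedaCare's or QlikView's actual schema.

```python
# In-memory business-intelligence sketch: load once, then explore from RAM.
import pandas as pd

# One-time load from the disparate systems' exports (hypothetical files/columns).
records = pd.read_csv("medical_records.csv")   # patient_id, clinic, diagnosis
payroll = pd.read_csv("payroll.csv")           # clinic, month, overtime_hours
billing = pd.read_csv("billing.csv")           # patient_id, payer, amount

# Build relationships between the data sets in memory.
patients = records.merge(billing, on="patient_id")

# Interactive-style questions answered from RAM, not from the source databases.
revenue_by_payer = patients.groupby(["clinic", "payer"])["amount"].sum()
overtime_by_clinic = payroll.groupby("clinic")["overtime_hours"].sum()

print(revenue_by_payer)
print(overtime_by_clinic)
```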
Before QlikView, ThedaCare produced about 9,000 reports a month, ranging from descriptions of the patient population (meant for internal use) to assessments of ThedaCare's performance that the government could compare with state and national averages. Every time Veara wanted to pull information for one of those reports, he needed to know ahead of time exactly what he was looking for and where to find it. Asking the wrong question might yield irrelevant information or even a blank document. And each request could take up to 15 minutes to process.

Two Devices Treat Alzheimer's

Pharmaceutical companies developing Alzheimer's drugs have faced one hurdle after another. The most effective treatments are difficult to get into the brain, while those that show success in animals have yet to benefit humans.
Two startup companies aim to solve these problems by targeting the brain electrically rather than chemically. Both are using technologies that have proved successful for other brain disorders. One company plans to use deep brain stimulation, which has been used to treat tens of thousands of Parkinson's patients. The other hopes to find success with transcranial magnetic stimulation, a noninvasive approach used both to treat depression and, as a research tool, to stimulate or inhibit specific parts of the brain.
In deep brain stimulation, electrical pulses are delivered to a dysfunctional part of the brain via a surgically implanted electrode, stimulating neural activity. The technology is being used or tested for a growing number of disorders, including medication-resistant epilepsy, depression, and obsessive-compulsive disorder. Neurosurgeon Andres Lozano, at the University of Toronto, became interested in its potential for treating Alzheimer's thanks to an unexpected finding published in 2008. Researchers were testing whether they could help a morbidly obese patient lose weight by stimulating a part of the brain that governs satiety. Follow-up tests revealed that the patient showed a significant improvement in memory.
Brain imaging revealed that the obesity treatment activated various brain structures involved in memory. Such structures have typically deteriorated in Alzheimer's patients, and Lozano's idea is to use deep brain stimulation to boost activity in the memory circuits that patients have left. Late last year, Lozano formed the startup Functional Neuromodulation to commercialize the technology, together with Daniel O'Connell, founder of Neuroventures, who serves as the new company's CEO.
It's still unclear how well the approach will work in people with the neurodegenerative disorder. A small study published last fall showed mixed results. The treatment appeared to slow cognitive decline in a few patients, but had no effect in others. However, researchers did find that deep brain stimulation reversed one of the markers of Alzheimer's: impaired glucose metabolism in the brain. Preliminary evidence suggests that it is more effective when used at earlier stages of the disease.