Audi Bets on Bio Gasoline Startup

Startup Global Bioenergies uses genetic engineering to avoid one of the costliest steps in biofuel production.

By Kevin Bullis

Audi is investing in a startup, Paris-based Global Bioenergies, that says it can make cheap gasoline from sugar and other renewable sources. The strategic partnership includes stock options and an unspecified amount of funding.

As with conventional biofuel production, Global Bioenergies’ technology uses microörganisms to ferment sugars to produce fuel. But its process eliminates the second most costly part of producing biofuels—the energy-intensive distillation step. And by making gasoline instead of ethanol, the startup skirts a major problem hampering growth in biofuels—the fact that the market for ethanol is saturated.

Global Bioenergies has demonstrated its technology in the lab and is building two pilot facilities to produce isobutene, a hydrocarbon that a partner will convert into gasoline through an existing chemical process. The larger of the two pilot facilities will be big enough to support the production of over 100,000 liters of gasoline a year.

The process addresses one of the key challenges with conventional biofuels production—the fuel can kill the microörganisms that make it. In a conventional fermentation process, once the concentration of ethanol gets to about 12 percent, it starts to poison the yeast so that it can’t make any more ethanol.

Global Bioenergies has genetically engineered E. coli bacteria to produce a gas (isobutene) that bubbles out of solution, so its concentration in the fermentation tank never reaches toxic levels. As a result the bacteria can go on producing fuel longer than in the conventional process, increasing the output of a plant and reducing capital costs.

The isobutene still needs to be separated from other gases such as carbon dioxide, but Global Bioenergies says this is much cheaper than distillation.

The new process doesn’t address the biggest cost of biofuels today—the cost of the raw materials. It’s designed to run on glucose, the type of sugar produced from corn or sugarcane. But the company is adapting it to work with sugars from non-food sources such as wood chips, which include glucose but also other sugars such as xylose.


Audi’s partnership with Global Bioenergies is part of a push by the automaker to reduce greenhouse gas emissions in the face of tightening regulations. Audi recently announced two other investments in cleaner fuels. It funded a project to make methane using renewable energy—the methane can be used to run Audi’s natural-gas-fueled cars (see “Audi to Make Fuel Using Solar Power”). And it funded Joule Unlimited, which is using photosynthetic microörganisms to make ethanol and diesel (see “Audi Backs a Biofuels Startup”).

Is Google Cornering the Market on Deep Learning?

A cutting-edge corner of science is being wooed by Silicon Valley, to the dismay of some academics.

By Antonio Regalado

How much are a dozen deep-learning researchers worth? Apparently, more than $400 million.

This week, Google reportedly paid that much to acquire DeepMind Technologies, a startup based in London that had one of the biggest concentrations of researchers anywhere working on deep learning, a relatively new field of artificial intelligence research that aims to achieve tasks like recognizing faces in video or words in human speech (see “Deep Learning”).

The acquisition, aimed at adding skilled experts rather than specific products, marks an acceleration in efforts by Google, Facebook, and other Internet firms to monopolize the biggest brains in artificial intelligence research.

In an interview last month, before the DeepMind acquisition, Peter Norvig, a director of research at Google, estimated that his company already employed “less than 50 percent but certainly more than 5 percent” of the world’s leading experts in machine learning, the wider discipline of which deep learning is the cutting edge.

Companies like Google expect deep learning to help them create new types of products that can understand and learn from the images, text, and video clogging the Web. And to a significant degree, leading academic scientists have embraced Silicon Valley, where they can command teams of engineers instead of students and have access to the largest, most interesting data sets. “It’s a combination of the computing resources we have and the headcounts we can offer,” Norvig said. “At Google, if you want a copy of the Web, well, we just happen to have one sitting around.”

Yoshua Bengio, an AI researcher at the University of Montreal, estimates that there are only about 50 experts worldwide in deep learning, many of whom are still graduate students. He estimated that DeepMind employed about a dozen of them on its staff of about 50. “I think this is the main reason that Google bought DeepMind. It has one of the largest concentrations of deep learning experts,” Bengio says.

Vying with Google for talent are companies including Amazon, Microsoft, and Facebook, which in September created its own deep learning group (see “Facebook Launches Advanced AI Effort to Find Meaning in Your Posts”). It recruited perhaps the world’s best-known deep learning scientist, Yann LeCun of New York University, to run it. His NYU colleague, Rob Fergus, also accepted a job at the social network.


As advanced machine learning transitions from a primarily scientific pursuit to one with high industrial importance, Google’s bench is probably deepest. Names it has lured from academia into full-time or part-time roles include Sebastian Thrun (who has worked on the company’s autonomous car project); Fernando Pereira, a onetime University of Pennsylvania computer scientist; Stanford’s Andrew Ng; and Singularity University boss Ray Kurzweil.

Last year, Google also grabbed renowned University of Toronto deep-learning researcher Geoff Hinton and a passel of his students when it acquired Hinton’s company, DNNresearch. Hinton now works part-time at Google. “We said to Geoff, ‘We like your stuff. Would you like to run models that are 100 times bigger than anyone else’s?’ That was attractive to him,” Norvig said.

Not everyone is happy about the arrival of the proverbial Google Bus in one of academia’s rarefied precincts. In December, during a scientific meeting in Lake Tahoe, Mark Zuckerberg, the founder and CEO of Facebook, made a surprise appearance accompanied by uniformed guards, according to Alex Rubinsteyn, a bioinformatics researcher at Mount Sinai Medical Center, who complained in a blog post that a cultural “boundary between academia and Silicon Valley” had been crossed.

“In academia, status is research merit, it’s what you know,” Rubinsteyn says. “In Silicon Valley, it’s because you run a company or are rich. And then people around those people also think about getting rich.”

Peter Lee, head of Microsoft Research, told Bloomberg Businessweek that deep learning experts were in such demand that they command the same types of seven-figure salaries as some first-year NFL quarterbacks.

Some have resisted industry’s call. Of the three computer scientists considered among the originators of deep learning—Hinton, LeCun, and Bengio—only Bengio has so far stayed put in the ivory tower. “I just didn’t think earning 10 times more will make me happier,” he says. “As an academic I can choose what to work on and consider very long-term goals.” Plus, he says, industry grants have started to flow his way as companies realize they’ll soon run out of recruits. This year, he’s planning to increase the number of graduate students he’s training from four to 15.

DeepMind was cofounded two years ago by Demis Hassabis, a 37-year-old described by The Times of London as a game designer, neuroscientist, and onetime chess prodigy. The DeepMind researchers were well known in the scientific community, attending meetings and publishing “fairly high-level” papers in machine learning, although they had not yet released a product, says Bengio.

DeepMind’s expertise is in an area called reinforcement learning, which involves getting computers to learn about the world even from very limited feedback. “Imagine if I only told you what grades you got on a test, but didn’t tell you why, or what the answers were,” says Bengio. “It’s a difficult problem to know how you could do better.”

But in December, DeepMind published a paper showing that its software could do that by learning how to play seven Atari 2600 games using as inputs only the information visible on a video screen, such as the score. For three of the games, the classics Breakout, Enduro, and Pong, the computer ended up playing better than an expert human. It performed less well on Q*bert and Space Invaders, games where the best strategy is less obvious.
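DeepMind’s system pairs reinforcement learning with a deep neural network, but the underlying idea of learning from a score alone can be illustrated with a much simpler tabular sketch. The Python snippet below is a hypothetical, minimal Q-learning loop, not DeepMind’s method: the state and action names are invented, and the agent only ever sees a reward signal, never the correct answer.

```python
import random
from collections import defaultdict

# Minimal tabular Q-learning: the agent improves using reward alone,
# without ever being told which action was "correct."
ACTIONS = ["left", "right", "fire"]      # hypothetical action set
Q = defaultdict(float)                   # Q[(state, action)] -> value estimate
ALPHA, GAMMA, EPSILON = 0.1, 0.99, 0.1   # learning rate, discount, exploration rate

def choose_action(state):
    if random.random() < EPSILON:                     # occasionally explore
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])  # otherwise exploit current estimates

def update(state, action, reward, next_state):
    # Nudge the estimate toward: reward now + discounted value of the best next action.
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
```

DeepMind’s contribution was to replace the lookup table with a deep neural network that maps raw screen pixels to action values, which is what let one program learn several different Atari games from the score alone.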


Such skilled computer programs could have important commercial applications, including improving search engines (see “How a Database of the World’s Knowledge Shapes Google’s Future”), and might be particularly useful in helping robots learn to navigate the human world. Google last year acquired several leading robotics companies, including the makers of various types of humanoid robots (see “Google’s Latest Robot Acquisition Is the Smartest Yet”).

Certainly, large companies wouldn’t be spending so heavily to monopolize talent in artificial intelligence unless they believed that these computer brains will give them a powerful edge. It may sound like a movie plot, but perhaps it’s even time to wonder what the first company in possession of a true AI would do with the power that it provided.

Bengio says not to worry about that. “Industry is interested in applying machine learning, and especially deep learning, to the tasks that they want to solve,” he says. “Those [efforts] are on the way towards AI, but still far from it.”

“Honey Encryption” Will Bamboozle Attackers with Fake Secrets

A new approach to encryption beats attackers by presenting them with fake data.

By Tom Simonite

Ari Juels, an independent researcher who was previously chief scientist at computer security company RSA, thinks something important is missing from the cryptography protecting our sensitive data: trickery.

“Decoys and deception are really underexploited tools in fundamental computer security,” Juels says. Together with Thomas Ristenpart of the University of Wisconsin, he has developed a new encryption system with a devious streak. It gives encrypted data an additional layer of protection by serving up fake data in response to every incorrect guess of the password or encryption key. If the attacker does eventually guess correctly, the real data should be lost amongst the crowd of spoof data.

The new approach could be valuable given how frequently large encrypted stashes of sensitive data fall into the hands of criminals. Some 150 million usernames and passwords were taken from Adobe servers in October 2013, for example.

After capturing encrypted data, criminals often use software to repeatedly guess the password or cryptographic key used to protect it. The design of conventional cryptographic systems makes it easy to know when such a guess is correct or not: the wrong key produces a garbled mess, not a recognizable piece of raw data.

Juels and Ristenpart’s approach, known as Honey Encryption, makes it harder for an attacker to know if they have guessed a password or encryption key correctly or not. When the wrong key is used to decrypt something protected by their system, the Honey Encryption software generates a piece of fake data resembling the true data.

If an attacker used software to make 10,000 attempts to decrypt a credit card number, for example, they would get back 10,000 different fake credit card numbers. “Each decryption is going to look plausible,” says Juels. “The attacker has no way to distinguish a priori which is correct.” Juels previously worked with Ron Rivest, the “R” in RSA, to develop a system called Honey Words to protect password databases by also stuffing them with false passwords.
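The trick can be sketched in a few lines of Python. This is a toy illustration under a strong assumption, not Juels and Ristenpart’s construction: it treats the secret as a uniformly random 16-digit number, so any wrong password decrypts to a different but equally plausible-looking number. Real card numbers have structure (issuer prefixes, a checksum) that a real honey encryption scheme must model.

```python
import hashlib

MOD = 10 ** 16  # toy message space: 16-digit numbers

def keystream(password: str, nonce: bytes) -> int:
    # Toy key derivation; a real system would use a slow password hash.
    digest = hashlib.sha256(nonce + password.encode()).digest()
    return int.from_bytes(digest, "big") % MOD

def encrypt(card_number: int, password: str, nonce: bytes) -> int:
    return (card_number + keystream(password, nonce)) % MOD

def decrypt(ciphertext: int, password: str, nonce: bytes) -> int:
    return (ciphertext - keystream(password, nonce)) % MOD

nonce = b"demo-nonce"
ct = encrypt(4111_1111_1111_1111, "correct horse", nonce)
print(f"{decrypt(ct, 'correct horse', nonce):016d}")  # the real number
print(f"{decrypt(ct, 'wrong guess', nonce):016d}")    # a different, plausible-looking number
```

Because every candidate password maps the ciphertext onto some valid-looking plaintext, a brute-force attacker gets no signal about which guess was right.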

Juels and Ristenpart will present a paper on Honey Encryption at the Eurocrypt cryptography conference later this year. Juels is also working on building a system based on it to protect the data stored by password manager services such as LastPass and Dashlane. These services store all of a person’s different passwords in an encrypted form, protected by a single master password, so that software can automatically enter them into websites.

Password managers are a tasty target for criminals, says Juels. He believes that many people use an insecure master password to protect their collection. “The way they’re constructed discourages the use of a strong password because you’re constantly having to type it in—also on a mobile device in many cases.”


Juels predicts that if criminals got hold of a large collection of encrypted password vaults they could probably unlock many of them without too much trouble by guessing at the master passwords. But if those vaults were protected with Honey Encryption, each incorrect attempt to decrypt a vault would yield a fake one instead.

Hristo Bojinov, CEO and founder of mobile software company Anfacto, who has previously worked on the problem of protecting password vaults as a security researcher, says Honey Encryption could help reduce their vulnerability. But he notes that not every type of data will be easy to protect this way since it’s not always possible to know the encrypted data in enough detail to produce believable fakes. “Not all authentication or encryption systems yield themselves to being ‘honeyed.’”

Juels agrees, but is convinced that by now enough password dumps have leaked online to make it possible to create fakes that accurately mimic collections of real passwords. He is currently working on creating the fake password vault generator needed for Honey Encryption to be used to protect password managers. This generator will draw on data from a small collection of leaked password manager vaults, several large collections of leaked passwords, and a model of real-world password use built into a powerful password cracker.

A 96-Antenna System Tests the Next Generation of Wireless

Rice University is testing a highly efficient wireless communications system.

By David Talbot

Even as the world’s carriers build out the latest wireless infrastructure, known as 4G LTE, a new apparatus bristling with 96 antennas taking shape at a Rice University lab in Texas could help define the next generation of wireless technology.

The Rice rig, known as Argos, represents the largest such array yet built and will serve as a test bed for a concept known as “Massive MIMO.”

MIMO, or “multiple-input, multiple-output,” is a wireless networking technique aimed at transferring data more efficiently by having several antennas work together to exploit a natural phenomenon that occurs when signals are reflected en route to a receiver. The phenomenon, known as multipath, can cause interference, but MIMO alters the timing of data transmissions in order to increase throughput using the reflected signals.

MIMO is already used for 4G LTE and in the latest version of Wi-Fi, called 802.11ac, but it typically involves only a handful of transmitting and receiving antennas. Massive MIMO extends this approach by using scores or even hundreds of antennas. It increases capacity further by effectively focusing signals on individual users, allowing numerous signals to be sent over the same frequency at once. Indeed, an earlier version of Argos, with 64 antennas, demonstrated that network capacity could be boosted by more than a factor of 10.
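The effect of adding antennas can be seen in the standard information-theoretic capacity formula. The Python sketch below is a toy Monte Carlo calculation, not data from Argos: it assumes K single-antenna users transmitting to an M-antenna base station over an idealized i.i.d. Rayleigh-fading channel with equal per-user signal-to-noise ratio, and computes the average uplink sum capacity C = log2 det(I + SNR·HHᴴ).

```python
import numpy as np

def uplink_sum_capacity(m_antennas, k_users, snr_db, trials=500, seed=0):
    """Average uplink sum capacity (bits/s/Hz) for k single-antenna users
    sending to an m-antenna base station over i.i.d. Rayleigh fading."""
    rng = np.random.default_rng(seed)
    snr = 10 ** (snr_db / 10)          # equal per-user SNR, linear scale
    total = 0.0
    for _ in range(trials):
        h = (rng.standard_normal((m_antennas, k_users))
             + 1j * rng.standard_normal((m_antennas, k_users))) / np.sqrt(2)
        _, logdet = np.linalg.slogdet(np.eye(m_antennas) + snr * (h @ h.conj().T))
        total += logdet / np.log(2)    # natural log -> bits
    return total / trials

for m, k in [(1, 1), (8, 8), (64, 8), (96, 8)]:
    print(f"{m:3d} base-station antennas, {k} users: "
          f"{uplink_sum_capacity(m, k, snr_db=10):6.1f} bits/s/Hz")
```

With one antenna serving one user, the link tops out at a few bits per second per hertz; with many base-station antennas, several users can share the same frequency at once, which is the kind of gain Argos is built to measure in practice.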

“If you have more antennas, you can serve more users,” says Lin Zhong, associate professor of computer science at Rice and the project’s co-leader. And the architecture allows it to easily scale to hundreds or even thousands of antennas, he says.

Massive MIMO requires more processing power because the base station must direct radio signals narrowly at the phones intended to receive them, which takes extra computation. The point of the Argos test bed is to see how much benefit can be obtained in the real world. Processors distributed throughout the setup allow it to test different network configurations, including how it would work alongside another emerging class of base stations, known as small cells, that serve small areas.

“Massive MIMO is an intellectually interesting project,” says Jeff Reed, director of the wireless research center at Virginia Tech. “You want to know: how scalable is MIMO? How many antennas can you benefit from? These projects are attempting to address that.”


An alternative, or perhaps complementary, approach to an eventual 5G standard would use extremely high frequencies, around 28 gigahertz. The wavelength at that frequency is one to two orders of magnitude shorter than the wavelengths that carry cellular communications today, allowing more antennas to be packed into the same space, such as within a smartphone. But since 28-gigahertz signals are easily blocked by buildings, and even by foliage and rain, they’ve long been seen as unusable except in special line-of-sight applications.

But Samsung and New York University have collaborated to solve this, also by using multi-antenna arrays. Their system transmits over 64 antennas, dividing data among them to increase throughput and dynamically changing which antennas are used and the direction the signal is sent in order to get around environmental blockages (see “What 5G Will Be: Crazy Fast Wireless Tested in New York City”).

Meantime, some experiments have been geared toward pushing existing 4G LTE technology further. The technology can, in theory, deliver 75 megabits per second, though real-world speeds are lower. But some research suggests it can go faster by stitching together streams of data from several wireless channels (see “LTE Advanced Is Poised to Turbocharge Smartphone Data”).

Emerging research done on Argos and in other wireless labs will help to define a new 5G phone standard. Whatever the specifics, it’s likely to include more sharing of spectrum, more small transmitters, new protocols, and new network designs. “To introduce an entirely new wireless technology is a huge task,” says Thomas Marzetta, the Bell Labs researcher who originated the Massive MIMO concept.

Android App Warns When You’re Being Watched

Researchers find a way to give Android users prominent warnings when apps are tracking their location.

By David Talbot

A new app notifies people when an Android smartphone app is tracking their location, something not previously possible without modifying the operating system on a device, a practice known as “rooting.”

The new technology comes amid revelations that the National Security Agency seeks to gather personal data from smartphone apps (see “How App Developers Leave the Door Open to NSA Surveillance”). But it may also help ordinary people better grasp the extent to which apps collect and share their personal information. Even games and dictionary apps routinely track location, as collected from a phone’s GPS (global positioning system) sensor.


Existing Android interfaces do include a tiny icon showing when location information is being accessed, but few people notice or understand what it means, according to a field study done as part of a new research project led by Janne Lindqvist, an assistant professor at Rutgers University. Lindqvist’s group created an app that puts a prominent banner across the top of the app saying, for example, “Your location is accessed by Dictionary.” The app is being readied for Google Play, the Android app store, within two months.

Lindqvist says Android phone users who used a prototype of his app were shocked to discover how frequently they were being tracked. “People were really surprised that some apps were accessing their location, or how often some apps were accessing their location,” he says.

According to one Pew Research survey, almost 20 percent of smartphone owners surveyed have tried to disconnect location information from their apps, and 70 percent wanted to know more about the location data collected by their smartphone.

The goal of the project, Lindqvist says, is to goad Google and app companies into providing more prominent disclosures, collecting less personal information, and allowing users to select which data they will allow the app to see. A research paper describing the app and the user study was recently accepted for an upcoming computer security conference.

In many cases, location information is used by advertisers to provide targeted ads. But information gained by apps often gets passed around widely to advertising companies (see “Mobile-Ad Firms Seek New Ways to Track You” and “Get Ready for Ads That Follow You from One Device to the Next”).

Google, which maintains the Android platform, has engineered it to block an app from gaining information about other apps. So Lindqvist’s team relied on an indirect method: a function within Android’s location application programming interface (API) that signals when any app requests location information. “People have previously done this with platform-level changes—meaning you would need to ‘root’ the phone,” says Lindqvist. “But nobody has used an app to do this.”

Google has flip-flopped on how much control it gives users over the information apps can access. In Android version 4.3, available since July of last year, users gained the ability to disable and enable apps’ “permissions” one by one, but Google reversed course in December 2013, removing the feature in the 4.4.2 update, according to the Electronic Frontier Foundation.

The new app and study from Lindqvist’s team could help push Google back toward giving users more control. “Because we know how ubiquitous NSA surveillance is, this is one tool to make people aware,” he says.

The work adds to similar investigative work about Apple’s mobile operating system, iOS. Last year different academic researchers found that Apple wasn’t doing a good job stopping apps from harvesting the unique ID numbers of a device (see “Study Shows Many Apps Defy Apple’s Privacy Advice”). Those researchers released their own app, called ProtectMyPrivacy, that detects what data other apps on an iPhone try to access, notifies the owner, and makes a recommendation about what to do. However, that app requires users to first “jailbreak” or modify Apple’s operating system. Still, unlike Android, Apple allows users to individually control which categories of information an app can access.

“Telling people more about their privacy prominently and in an easy-to-understand manner, especially the location, is important,” says Yuvraj Agarwal, who led that research at the University of California, San Diego, and has since moved on to Carnegie Mellon University. Ultimately, though, Agarwal believes users must be able to take action on an app’s specific permissions. “If my choice is to delete Angry Birds or not, that’s not really a choice,” he says.

Chasing the Dream of Half-Price Gasoline from Natural Gas

A startup called Siluria thinks it’s solved a mystery that has stymied huge oil companies for decades.

By Kevin Bullis

At a pilot plant in Menlo Park, California, a technician pours white pellets into a steel tube and then taps it with a wrench to make sure they settle together. He closes the tube, and oxygen and methane—the main ingredient of natural gas—flow in. Seconds later, water and ethylene, the world’s largest commodity chemical, flow out. Another simple step converts the ethylene into gasoline.

The white pellets are a catalyst developed by the Silicon Valley startup Siluria, which has raised $63.5 million in venture capital. If the catalysts work as well in a large, commercial-scale plant as they do in tests, Siluria says, the company could produce gasoline from natural gas at about half the cost of making it from crude oil—at least at today’s cheap natural-gas prices.

If Siluria really can make cheap gasoline from natural gas it will have achieved something that has eluded the world’s top chemists and oil and gas companies for decades. Indeed, finding an inexpensive and direct way to upgrade natural gas into more valuable and useful chemicals and fuels could finally mean a cheap replacement for petroleum.

Natural gas burns much more cleanly than oil—power plants that burn oil emit 50 percent more carbon dioxide than natural gas ones. It also is between two and six times more abundant than oil, and its price has fallen dramatically now that technologies like fracking and horizontal drilling have led to a surge of production from unconventional sources like the Marcellus Shale. While oil costs around $100 a barrel, natural gas sells in the U.S. for the equivalent of $20 a barrel.

But until now oil has maintained a crucial advantage: natural gas is much more difficult to convert into chemicals such as those used to make plastics. And it is relatively expensive to convert natural gas into liquid fuels such as gasoline. It cost Shell $19 billion to build a massive gas-to-liquids plant in Qatar, where natural gas is almost free. The South African energy and chemicals company Sasol is considering a gas-to-liquids plant in Louisiana that it says will cost between $11 billion and $14 billion. Altogether, such plants produce only about 400,000 barrels of liquid fuels and chemicals a day, which is less than half of 1 percent of the 90 million barrels of oil produced daily around the world.

The costs are so high largely because the process is complex and consumes a lot of energy. First, high temperatures are required to break methane down into carbon monoxide and hydrogen, creating what is called syngas. The syngas is then subjected to catalytic reactions that turn it into a mixture of hydrocarbons that is costly to refine and separate into products.
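In simplified overall stoichiometry (shown for orientation only, not as any particular plant’s exact chemistry), the conventional two-step route is:

\[
\mathrm{CH_4} + \mathrm{H_2O} \;\rightarrow\; \mathrm{CO} + 3\,\mathrm{H_2} \quad \text{(steam reforming of methane to syngas)}
\]
\[
n\,\mathrm{CO} + (2n+1)\,\mathrm{H_2} \;\rightarrow\; \mathrm{C}_n\mathrm{H}_{2n+2} + n\,\mathrm{H_2O} \quad \text{(Fischer-Tropsch synthesis of hydrocarbons)}
\]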

For years, chemists have been searching for catalysts that would simplify the process, skipping the syngas step and instead converting methane directly into a specific, desired chemical. Such a process wouldn’t require costly refining and separation steps, and it might consume less energy. But the chemistry is difficult—so much so that some of the world’s top petroleum companies gave up on the idea in the 1980s.

Siluria thinks it can succeed where others have failed not because it understands the chemistry better, but because it has developed new tools for making and screening potential catalysts. Traditionally, chemists have developed catalysts by analyzing how they work and calculating what combination of elements might improve them. Siluria’s basic philosophy is to try out a huge number of catalysts in the hope of getting lucky. The company built an automated system—it looks like a mess of steel and plastic tubes, mass spectrometers, small stainless steel furnaces, and data cables—that can quickly synthesize hundreds of different catalysts at a time and then test how well they convert methane into ethylene.

The system works by varying both what catalysts are made of—the combinations and ratios of various elements—and their microscopic structure. Siluria was founded based on the work of Angela Belcher, a professor of biological engineering at MIT who developed viruses that can assemble atoms of inorganic materials into precise shapes. Siluria uses this and other methods to form nanowires from the materials that make up its catalysts. Sometimes the shape of a nanowire changes the way the catalyst interacts with gases such as methane—and this can transform a useless combination of elements into an effective one. “How you build up the structure of the catalyst matters as much as its composition,” says Erik Scher, Siluria’s vice president of research and development.

The process of making and testing catalysts isn’t completely random—Siluria has the work of earlier chemists to guide it, and it has developed software that sorts out the most efficient way to screen a wide variety of possibilities. The result is that what used to take chemists a year Siluria can now do in a couple of days, Scher says. “We’ve made and screened over 50,000 catalysts at last count,” he says. “And I haven’t been counting in a while.”
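A hypothetical sketch of the software side of such a screening loop, in Python: candidate compositions are generated, each is scored for ethylene yield by a measurement step (mocked up here), and only the best performers seed the next round. The element list, parameters, and scoring function are invented for illustration; Siluria has not published the details of its system.

```python
import random

ELEMENTS = ["La", "Sr", "Mn", "W", "Na", "Mg"]   # hypothetical ingredient set

def random_candidate():
    """Propose a composition and a nanowire growth setting to try."""
    metals = random.sample(ELEMENTS, k=3)
    return {
        "metals": metals,
        "ratios": [round(random.uniform(0.05, 1.0), 2) for _ in metals],
        "nanowire_temp_C": random.choice([600, 700, 800]),
    }

def measure_ethylene_yield(candidate):
    """Stand-in for the automated reactor test; returns a mock score."""
    return random.random()

def screen(batch_size=256, keep=10):
    batch = [random_candidate() for _ in range(batch_size)]
    ranked = sorted(batch, key=measure_ethylene_yield, reverse=True)
    return ranked[:keep]   # best performers inform the next batch of variations

for hit in screen():
    print(hit)
```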

Nonetheless, some seasoned chemists are skeptical that Siluria can succeed. Siluria’s process is a version of one that chemists pursued in the 1970s and 1980s known as oxidative coupling, which involves reacting methane with oxygen. The problem with this approach is that it’s hard to get the reaction to stop at ethylene and not keep going to make carbon dioxide and water. “The reaction conditions you need to convert methane to ethylene do at least as good a job, if not better, of converting ethylene into carbon dioxide, which is useless,” says Jay Labinger, a chemist at the Beckman Institute at Caltech.
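In simplified form, the desired reaction and the competing over-oxidation Labinger describes are:

\[
2\,\mathrm{CH_4} + \mathrm{O_2} \;\rightarrow\; \mathrm{C_2H_4} + 2\,\mathrm{H_2O} \quad \text{(oxidative coupling of methane to ethylene)}
\]
\[
\mathrm{C_2H_4} + 3\,\mathrm{O_2} \;\rightarrow\; 2\,\mathrm{CO_2} + 2\,\mathrm{H_2O} \quad \text{(over-oxidation of the ethylene product)}
\]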

In the late 1980s, Labinger wrote a paper that warned researchers not to waste their time working on the process. And history seems to have borne him out. The process “hasn’t been, and doesn’t appear at all likely to be” an economically viable one, he says.

Yet in spite of the challenging chemistry, Siluria says the performance of its catalysts at its pilot plant has justified building two larger demonstration plants—one across San Francisco Bay in Hayward, California, that will make gasoline, and one in Houston that will make only ethylene. The plants are designed to prove to investors that the technology can work at a commercial scale, and that the process can be plugged into existing refineries and chemical plants, keeping down capital costs. The company hopes to open its first commercial plants within four years.

Siluria can’t tell you exactly how it’s solved the problem that stymied chemists for decades—if indeed it has. Because of the nature of its throw-everything-at-the-wall approach, it doesn’t know precisely how its new catalyst works. All it knows is that the process appears to work.

The hope for finding more valuable uses for natural gas—and making natural gas a large-scale alternative to oil—doesn’t rest on Siluria alone. The abundance of cheap natural gas has fueled a number of startups with other approaches. Given the challenges that such efforts have faced, there’s good reason to be skeptical that they will succeed, says David Victor, director of the Laboratory on International Law and Regulation at the University of California at San Diego. But should some of them break through, he says, “that would be seismic.”

The Power to Decide

What’s the point of all that data, anyway? It’s to make decisions.

By Antonio Regalado

Back in 1956, an engineer and a mathematician, William Fair and Earl Isaac, pooled $800 to start a company. Their idea: a score to handicap whether a borrower would repay a loan.

It was all done with pen and paper. Income, gender, and occupation produced numbers that amounted to a prediction about a person’s behavior. By the 1980s the three-digit scores were calculated on computers and instead took account of a person’s actual credit history. Today, Fair Isaac Corp., or FICO, generates about 10 billion credit scores annually, recalculating the scores of many Americans 50 times a year.
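A minimal illustration of the kind of additive scorecard Fair and Isaac could have worked out by hand, written here in Python. The attributes, point values, and cutoff are invented for illustration; they are not FICO’s model, which today is built from credit-history data.

```python
# Toy additive scorecard: each attribute contributes points toward a total,
# and the total is compared with a cutoff to reach a lending decision.
# All attributes and point values below are invented for illustration.
SCORECARD = {
    "years_at_job":   lambda x: 30 if x >= 5 else 15 if x >= 2 else 0,
    "monthly_income": lambda x: 40 if x >= 5000 else 25 if x >= 2500 else 10,
    "owns_home":      lambda x: 20 if x else 0,
}
CUTOFF = 60

def score(applicant: dict) -> int:
    return sum(rule(applicant[field]) for field, rule in SCORECARD.items())

def decide(applicant: dict) -> str:
    s = score(applicant)
    return f"score {s}: {'approve' if s >= CUTOFF else 'decline'}"

print(decide({"years_at_job": 3, "monthly_income": 3200, "owns_home": True}))
# -> score 60: approve
```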

This machinery hums in the background of our financial lives, so it’s easy to forget that the choice of whether to lend used to be made by a bank manager who knew a man by his handshake. Fair and Isaac understood that all this could change, and that their company didn’t merely sell numbers. “We sell a radically different way of making decisions that flies in the face of tradition,” Fair once said.

This anecdote suggests a way of understanding the era of “big data”—terabytes of information from sensors or social networks, new computer architectures, and clever software. But even supercharged data needs a job to do, and that job is always about a decision.


In this business report, MIT Technology Review explores a big question: how are data and the analytical tools to manipulate it changing decision making today? On Nasdaq, trading bots exchange a billion shares a day. Online, advertisers bid on hundreds of thousands of keywords a minute, in deals greased by heuristic solutions and optimization models rather than two-martini lunches. The number of variables and the speed and volume of transactions are just too much for human decision makers.

Of course, there’s danger in letting the data decide too much. In this report, Duncan Watts, a Microsoft researcher specializing in social networks, outlines an approach to decision making that avoids the dangers of gut instinct as well as the pitfalls of slavishly obeying data. In short, Watts argues, businesses need to adopt the scientific method (see “Scientific Thinking in Business”).

To do that, they have been hiring a highly trained breed of business skeptics called data scientists. These are the people who create the databases, build the models, reveal the trends, and, increasingly, author the products. And their influence is growing in business. This could be why data science has been called “the sexiest job of the 21st century.” It’s not because mathematics or spreadsheets are particularly attractive. It’s because making decisions is powerful.

When there’s a person in the loop, technology takes a softer approach (see “Software That Augments Human Thinking”). Think of recommendation engines on the Web that suggest products to buy or friends to catch up with. This works because Internet companies maintain statistical models of each of us, our likes and habits, and use them to decide what we see. In this report, we check in with LinkedIn, which maintains the world’s largest database of résumés—more than 200 million of them. One of its newest offerings is University Pages, which crunches résumé data to offer students predictions about where they’ll end up working depending on what college they go to (see “LinkedIn Offers College Choices by the Numbers”).

These smart systems, and their impact, are prosaic next to what’s planned. Take IBM. The company is pouring $1 billion into its Watson computer system, the one that answered questions correctly on the game show Jeopardy! IBM now imagines computers that can carry on intelligent phone calls with customers, or provide expert recommendations after digesting doctors’ notes. IBM wants to provide “cognitive services”—computers that think, or seem to (see “Facing Doubters, IBM Expands Plans for Watson”).

Andrew Jennings, chief analytics officer for FICO, says automating human decisions is only half the story. Credit scores had another major impact. They gave lenders a new way to measure the state of their portfolios—and to adjust them by balancing riskier loan recipients with safer ones. Now, as other industries get exposed to predictive data, their approach to business strategy is changing, too. In this report, we look at one technique that’s spreading on the Web, called A/B testing. It’s a simple tactic—put up two versions of a Web page and see which one performs better (see “Seeking Edge, Websites Turn to Experiments” and “Startups Embrace a Way to Fail Fast”).
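At bottom, an A/B test is a two-sample comparison of conversion rates. The sketch below is a generic two-proportion z-test in Python using only the standard library, with made-up traffic numbers; production testing platforms add corrections for peeking at interim results, multiple variants, and other complications.

```python
import math

def ab_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-proportion z-test: is variant B's conversion rate different from A's?"""
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))   # two-sided
    return p_a, p_b, z, p_value

# Made-up example: 10,000 visitors saw each version of the page.
p_a, p_b, z, p = ab_test(230, 10_000, 275, 10_000)
print(f"A: {p_a:.2%}   B: {p_b:.2%}   z = {z:.2f}   p = {p:.3f}")
```

Here the B page converts 2.75 percent of visitors versus 2.30 percent for A, and the p-value of roughly 0.04 suggests the difference is unlikely to be chance, so a site might roll out B.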

Until recently, such optimization was practiced only by the largest Internet companies. Now, nearly any website can do it. Jennings calls this phenomenon “systematic experimentation” and says it will be a feature of the smartest companies. They will have teams constantly probing the world, trying to learn its shifting rules and deciding on strategies to adapt. “Winners and losers in analytic battles will not be determined simply by which organization has access to more data or which organization has more money,” Jennings has said.

Hacking the Immune System to Prevent Damage after a Heart Attack

Microparticles that block the body’s immune response to damaged tissue could help prevent further harm.

By Mike Orcutt

Using tiny biodegradable particles to disrupt the body’s normal immune response after a heart attack could help save patients from tissue damage and certain long-term health problems that often follow. Researchers have shown that injecting such particles into mice within 24 hours of a heart attack not only significantly reduces tissue damage, but also results in those mice having stronger cardiac function 30 days later. The inventors of the new technology now plan to pursue human trials.

Much of the tissue damage that results from a heart attack is the result of inflammation, the body’s natural response to harmful stimuli such as damaged muscle. But in the case of a heart attack, the immune cells that rush to the site do more harm than good, explains Daniel Getts, inventor of the new therapy and chief scientific officer of Cour Pharmaceutical Development. The immune system’s weaponry is “fairly generic,” he says. While the toxic compounds that the immune cells secrete can be beneficial in defending the body against an infection, they also cause tissue damage. This phenomenon occurs not only after heart attacks but also in a range of other diseases, including West Nile virus, inflammatory bowel disease, and multiple sclerosis.

The 500-nanometer particles must be negatively charged, and can be made of several different materials, including the one used for biodegradable sutures. The new research suggests that once the particles are in the bloodstream, the negative charge attracts a specific receptor on the surface of inflammatory monocytes. The particles bind to that receptor and divert the immune cells away from the heart and toward the spleen, where they die.

Preventing these cells from reaching the heart allows the damaged muscle to regenerate “along more regulated processes,” says Getts. Should the therapy translate to humans, he says, it has the potential to substantially reduce the long-term health drawbacks that some heart attack patients experience, including shortness of breath and limited ability to exercise.

The goal is to begin human tests by early next year. The company hopes that the therapy’s relatively simple mechanism, and the fact that the particle material, polyglycolic acid, is already approved by the U.S. Food and Drug Administration, will speed the development process.

But “there is still some homework to do,” in particular the teasing out of any potential side effects the microparticles might produce, says Matthias Nahrendorf, a professor of systems biology at Harvard. For example, the particles may activate the immune system in some yet-unknown way, he says. In addition, it will be important to determine how to administer the therapy so that it doesn’t compromise these cells’ ability to help in healing, and to defend the body against infection and other foreign invaders, says Nahrendorf.

Startup Thinks Its Battery Will Solve Renewable Energy’s Big Flaw

Aquion has started production of a low-cost sodium-ion battery aimed at making renewable energy viable.

By Kevin Bullis

A former Sony TV factory near Pittsburgh is coming to life again after lying idle for four years. Whirring robotic arms have started to assemble a new kind of battery that could make the grid more efficient and let villages run on solar power around the clock.

Aquion, the startup that developed the battery, has finished installing its first commercial-scale production line at the factory, and is sending out batteries for customers to evaluate. It recently raised $55 million of venture capital funding from investors including Bill Gates. The money will help it ramp up to full-speed production by this spring.

Jay Whitacre, the Carnegie Mellon professor of materials science and engineering who invented the new battery, says it will cost about as much as a lead-acid battery—one of the cheapest types of battery available—but will last more than twice as long. And while lead is toxic and the sulfuric-acid electrolyte in lead-acid batteries is potentially dangerous, the new battery is made of materials so safe you can eat them (although Whitacre says they taste terrible). Nontoxic materials are also a good fit for remote areas, where maintenance is difficult.

Most importantly, by providing an affordable way to store solar power for use at night or during cloudy weather, the technology could allow isolated populations to get electricity from renewable energy, rather than from polluting diesel generators. Combining solar power and inexpensive batteries would also be cheaper than running diesel generators in places where delivering fuel is expensive (see “How Solar-Based Microgrids Could Bring Power to Millions”).

The batteries could allow the grid to accommodate greater amounts of intermittent renewable energy. As Aquion scales up production and brings down costs, the batteries could also be used instead of a type of natural gas power plant—called a peaker plant—often used to balance supply and demand on the grid. When recharged using renewables, the batteries don’t need fuel, so they’re cleaner than the natural gas power plants.

In some places, concerns over pollution make new natural gas plants hard to build, which could create an opening for Aquion’s technology, even if it’s somewhat more expensive.

Much of the sprawling factory where Aquion is setting up shop is derelict, with potholes in the floor and piles of abandoned ductwork and manufacturing equipment left over from making old-fashioned cathode ray tube TVs. Aquion has tidied up a section of the factory and installed equipment that’s ordinarily used for making aspirin tablets or wrapping chocolates in foil and arranging them in boxes. Now the equipment stamps out and precisely assembles battery electrodes along with foil current collectors to make batteries the size of briefcases.

By using cheap equipment originally developed for large, existing industries, Aquion is keeping down costs, Whitacre says.

The battery is made of inexpensive materials including manganese oxide and water. In concept, it operates much like a lithium-ion battery, in which lithium ions shuttle between electrodes to create electrical current. But the new battery uses sodium ions instead of lithium ones, which makes it possible to use a salt water electrolyte instead of the more expensive—and flammable—electrolytes used in lithium-ion batteries.

The trade-off is that the batteries store less energy by weight and volume than lithium-ion batteries do, so they’re not practical for cars or portable electronics. But space isn’t as much of an issue for stationary applications, where batteries can be stacked in warehouses or shipping containers. For storing large amounts of power from the grid, success is “all about cost,” Whitacre says.

Aquion will need to compete with companies such as GE and Fluidic Energy, which are also manufacturing novel batteries for the grid (see “GE’s Novel Battery to Bolster the Grid” and “Years in the Making, Promising Rechargeable Metal-Air Batteries Head to Market”).

Power Electronics Smooth Solar Transition

New devices address instability caused by high penetration of distributed solar.

By Martin LaMonica

As rooftop solar panels become increasingly popular, utilities are growing concerned that they will put pressure on local grids, destabilizing power service and requiring costly equipment upgrades.

The rapid adoption of solar photovoltaics has already prompted changes in Germany and parts of Hawaii, California, and New Jersey. Because nearly 10 percent of Hawaiian Electric’s customers have rooftop solar, the utility now requires solar contractors and customers on the island of Oahu to get approval before installing a PV system. It’s also developing a model for sharing the cost of studying what upgrades may be required to add another rooftop solar system, says a spokesperson for the local utility.

To address the instability caused by distributed solar, startup Gridco Systems is introducing a product that uses power electronics to smooth out spikes in voltage caused by solar generators. The company thinks its ground- or pole-mounted devices will create a distributed control infrastructure to monitor and manage the flow of power for a number of uses, including solar integration.

Today’s electromechanical systems, such as capacitor banks or voltage regulators at substations, can take minutes to adjust voltage and are far removed from the solar installations where the problems occur. Meanwhile, prices have come down for power electronics, devices that can change the properties of electricity and precisely control the amount of power going to various applications. That means the technology is more economical for use in the power grid, says Naimish Patel, the CEO of Gridco Systems, which has raised $30 million from venture capitalists.

If voltage on a circuit goes too high, it can endanger utility crews and cause damage to both utility and customer equipment. Distributed solar can also cause reliability problems if there’s a fault on the grid. Power plants can ride through disturbances, but solar PV generators are designed to shut down immediately, which can cause a spike in demand for power.

As a general rule, when solar power represents more than 15 percent of the peak-time load, utilities will want to analyze the potential impact. Given the fast adoption of solar, utility executives say that some sort of planning for equipment upgrades is required. A group of 16 western utilities called the Western Electric Industry Leaders wrote a letter last year to policy makers calling for rules that require the installation of so-called smart inverters. These devices can convert solar panels’ direct current to household alternating current, and they can also address problems with power quality.

The U.S. can avoid what happened in Germany, where utilities and the solar industry spent hundreds of millions of dollars on equipment upgrades, the utility executives wrote: “These new smart inverters will only cost about $150 more than current inverters, approximately one percent of the overall cost (of a solar installation). This is a bargain price given the expensive retrofit process in Germany.”


Devices based on power electronics, like Gridco’s, are more expensive than smart inverters but offer more features, analysts say. Gridco’s product, for instance, can be used to regulate voltage in solar-heavy circuits and to support voltage for better efficiency as well. Because they can connect to utilities’ communications networks, utilities can also directly control them, something that is hard to do with inverters on customer premises.

Gridco says its power regulators cost between $5,000 and $8,000. One device could regulate a single cluster of homes with rooftop PV, but managing voltage fluctuations in a utility-scale installation could require multiple devices.