Organ-on-a-Chip Mimics Deadly Lung Condition

By Susan Young on November 7, 2012

Researchers at the Wyss Institute for Biologically Inspired Engineering at Harvard University have shown that their “lung-on-a-chip” technology can mimic a life-threatening lung condition. They also report that the lung chip revealed aspects of the disease that animal experiments would not have uncovered.

The study, published in today’s Science Translational Medicine, is the first definitive demonstration that the institute’s organ-mimicking chips, which include a gut, a heart, and a kidney (see “Building an Organ on a Chip”), can be used to model a disease and even test candidate drugs.

The lung-on-a-chip device is a clear, flexible thumb-sized block of polymer perforated by two tiny channels separated by a thin membrane. Air flows through one channel, which is lined with human lung cells; a nutrient-rich liquid that acts as a blood substitute flows through the other, which is lined with blood-vessel cells. A vacuum applied to the chip moves the channels to re-create the way human lung tissues physically expand and contract during breathing.

The study, led by Wyss Institute fellow Dongeun Huh, focused on pulmonary edema, a condition in which fluid and blood clots fill the lungs. It can be caused by heart failure as well as the side effects of a common cancer drug. The researchers injected the cancer drug into the blood-vessel-like channel and found that fluid and blood plasma proteins leaked across the membrane into the air channel, similar to the drug’s side effect in patients.

This led to two surprising discoveries, says study coauthor and Wyss lead staff scientist Geraldine Hamilton. One was that the immune system, which was not represented in the chip, is not required to cause the leakage side effect, as had previously been thought. The second was that turning on the vacuum system to create breathing-like movements made the leakage worse, another previously unknown aspect of pulmonary edema.

The researchers also show that a GlaxoSmithKline drug candidate could prevent the leakage in the chip system (GSK researchers were also coauthors on the Wyss study). In a separate study in the same issue of Science Translational Medicine, GSK researchers demonstrate in mice with heart failure that their drug can reduce pulmonary edema, helping to validate the chip system, says Hamilton. “The reality is that animals will be required for clinical testing for many years to come, but this moves us a step closer to finding alternatives,” she says.

There are skeptics. Organ-on-chip systems lack the typical environment an organ is exposed to, such as the hormones and other molecular cues constantly circulating through the body, says Michael Hayward, a lead scientist at Taconic, a life sciences company based in Cranbury, New Jersey. Hayward, who specializes in developing animal models of human disease, also notes that most diseases involve many organs, and that understanding how different organs interact to cause a disease state would be beyond the reach of a single organ-mimicking device.

Hamilton acknowledges that both industry and regulators are going to want lots of validation of the organ-on-chip technology before using it as an alternative to animals, but the potential benefits of the chip technology are evident in today’s study, she says. “Not only do we mimic clinical response, but we also found out something new. This is a glimpse into the effects this could have on drug discovery and development in the future,” she says. “Not only could you replace the animal, but you gain further insight.”

And one day, the researchers may begin to address the concerns of Hayward and others about the isolated nature of their devices. “Our ultimate goal, which is high risk, is not only to develop disease models but to develop an integrated body-on-a-chip, where we can start to link these organs, moving us a step closer to mimicking the whole human response,” she says.

ABB Advance Makes Renewable-Energy Supergrids Practical

November 12, 2012

ABB, the large power and automation company, has developed technology that could provide an efficient way to transmit power from widely distributed solar panels, wind turbines, and other sources of renewable energy. The new technology is a fast and efficient circuit breaker for high-voltage direct-current (DC) power lines, a device that has eluded technologists for 100 years. The breaker makes it possible to join high-voltage DC transmission lines to form a resilient power grid.

If renewable energy is ever to account for a large part of the total energy supply, countries will need to install new, large-scale transmission grids, both to get power to cities from remote areas such as deserts that often have the best renewable resources, and to combine power from widely distributed wind turbines and solar panels, which can help average out fluctuations in their output. In Europe, there’s been talk for years of a supergrid that would pull together power from hydroelectric dams in Scandinavia with wind farms in Germany and large solar farms in Spain and even North Africa (see “A Supergrid for Europe”).

DC lines have long been used to transmit power across the North Sea, and from large hydroelectric dams to cities. But until ABB’s advance, it wasn’t safe to connect DC lines into a large-scale grid.

ABB’s circuit breaker changes that. Within five milliseconds it can stop the flow of a huge amount of power—equal to the entire output of a nuclear power plant, ABB says. The breakers could be used to nearly instantaneously reroute power in a DC grid around a problem, allowing the grid to keep functioning. “Ordinarily, if something goes wrong anywhere, all the power goes off,” says Claes Rytoft, ABB’s chief technology officer. “The breaker can cut out the faulty line and keep the rest healthy.”

Researchers have been trying to develop high-voltage DC circuit breakers for a century (see “Edison’s Revenge: The Rise of DC Power”). Mechanical switches alone didn’t work—they shut off power too slowly. Power electronics made of transistors that can switch on and off large amounts of power offered a possible solution, but they proved far too inefficient. ABB’s solution combines power electronics with a mechanical switch to create a hybrid system that’s both fast and efficient. The new circuit breaker could also be far less expensive than systems that use only transistors.
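
As a rough illustration of why the hybrid approach works, here is a minimal Python sketch of a hybrid breaker’s trip sequence, based on how such designs are publicly described; the component names and millisecond timings are illustrative assumptions, not ABB’s specifications.

```python
# Illustrative sketch of a hybrid HVDC breaker's trip sequence.
# Component names and millisecond timings are assumptions for
# illustration, not ABB specifications.

from dataclasses import dataclass

@dataclass
class HybridBreaker:
    # In normal operation, current flows through a low-loss mechanical
    # path; the lossy semiconductor path carries current only for the
    # few milliseconds of a trip, which is why the hybrid stays efficient.
    mechanical_path_closed: bool = True
    semiconductor_path_on: bool = False

    def trip(self):
        """Interrupt a fault in three fast steps (times illustrative)."""
        events = []
        # 1. Turn on the semiconductor path, commutating current out of
        #    the mechanical branch.
        self.semiconductor_path_on = True
        events.append((0.0, "semiconductor path on, current commutated"))
        # 2. Open the ultrafast mechanical disconnector at near-zero
        #    current, so it never has to break an arc.
        self.mechanical_path_closed = False
        events.append((2.0, "mechanical disconnector open"))
        # 3. Switch the semiconductor path off; surge arresters absorb
        #    the line's remaining inductive energy.
        self.semiconductor_path_on = False
        events.append((5.0, "semiconductor path off, fault cleared"))
        return events

for t_ms, step in HybridBreaker().trip():
    print(f"t = {t_ms:3.1f} ms  {step}")
```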

“The cost of the power electronics breaker was humongous,” says Ram Adapa, a power delivery technical leader at the Electric Power Research Institute. “The hybrid breaker should be less costly.”

With the major hurdle to DC grids out of the way, ABB is now developing algorithms to control them. The system will still need to work in concert with AC lines for distributing the power in local communities, since there is no inexpensive DC equivalent of the transformers needed to step down power to the relatively low voltages used in homes and businesses. One of the first markets for the new technology could be Germany, which has decided to turn off its nuclear power plants and rely heavily on renewable energy (see “The Great German Energy Experiment”).

The degree to which high-voltage DC grids can help renewables may depend on the economics of installing underground cables versus overhead lines. Obtaining rights-of-way is one of the biggest obstacles to installing new transmission lines in many countries, and underground installations don’t require obtaining new rights-of-way, since they can be easily installed along existing roadways. ABB says that when the entire system cost is taken into account, underground installations are only slightly more expensive than overhead ones. But Adapa is skeptical, saying that underground installations could cost five times as much.

Breakthrough Offers a Better Way to Make Drugs

An MIT-Novartis collaboration could be a boost for so-called “continuous flow” manufacturing

By Susan Young on November 6, 2012

Despite the huge amounts of money that the pharmaceutical industry spends on drug discovery, it is notoriously old-fashioned in how it actually makes its products. Most drugs are made in batch processes, in which the ingredients, often powders, are added in successive and often disconnected steps. The process resembles a bakery more than it does a modern chemistry lab. That could be about to change.

This summer, a team of researchers from MIT and Swiss pharmaceutical company Novartis proved that a continuous production line, integrating several new chemical processes and equipment specially designed for the project, could make a higher-quality drug faster and less wastefully. This more nimble method may even create more opportunities in early drug discovery. In their continuous-manufacturing process, raw ingredients are fed into a parade of heaters, spinners, extractors, and sensors that relay the intermediates through chemical reactions. At the end, round, coated pills fall out.

Earlier this year, Novartis CEO Joseph Jimenez said that his company plans to build a commercial-scale continuous-manufacturing facility by 2015 (see “The Future of Pharma Is Incredibly Fast”). Other pharmaceutical companies, including Pfizer, the world’s largest, have invested in research to develop their own continuous-manufacturing technologies. But the success of the MIT collaboration suggests that Novartis may be the first to use it for production.

Moving from the batch method to the continuous method requires new kinds of reactions and equipment. While some segments of a batch process may themselves be called continuous because they are constantly running, the breakthrough in the MIT-Novartis collaboration is that each step of the process is fully integrated. The products of one reaction flow into the next, typically through small-volume tubes. This enables drugmakers to use certain kinds of chemical reactions that aren’t feasible in the large vats used in batch processing, such as those that require higher temperatures or that happen very rapidly. The method could bring new types of molecules into drug discovery.

Making the switch from batch to fully integrated, continuous production meant that even the way a pill was formed had to be tweaked. The experimental system built at MIT was a jumble of wires, heaters, filters, mixers, and tubes, all enclosed in a 24-foot-long, eight-foot-wide clear plastic case. It could produce a drug that would typically have to be made in multiple facilities. At a few spots, technicians could reach in and adjust equipment or add material, but for the most part the system was controlled by software that was fed details on temperature, pressure, and other reaction parameters by the many sensors keeping a close eye on the chemistry inside. The MIT system was made to produce one specific drug, but the researchers say it is adaptable—different pieces of equipment could be swapped in to create a different final product.
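
The MIT-Novartis control software itself has not been published; the sketch below is a toy illustration of the kind of sensor-driven feedback loop such a system runs, holding one reaction parameter at a setpoint. The reactor model, gains, and setpoint are invented for illustration.

```python
# A toy sketch of a sensor-driven control loop for one unit operation:
# hold reactor temperature at a setpoint by accumulating heater power in
# proportion to the error (a crude integral controller). The reactor
# model and gains are invented for illustration, not the actual
# MIT-Novartis control code.

class ToyReactor:
    """Stand-in for one stage of the plant, with a sensor and a heater."""
    def __init__(self):
        self.temp_c = 20.0
        self.power = 0.0
    def read_sensor(self):
        # Each tick, temperature rises with heater power and sheds heat.
        self.temp_c += 0.1 * self.power - 0.05 * (self.temp_c - 20.0)
        return self.temp_c
    def set_heater(self, power):
        self.power = power

def hold_setpoint(reactor, setpoint_c=120.0, gain=0.5, steps=300):
    power = 0.0
    for _ in range(steps):
        error = setpoint_c - reactor.read_sensor()
        power = max(0.0, power + gain * error)  # heaters cannot cool
        reactor.set_heater(power)

r = ToyReactor()
hold_setpoint(r)
print(f"temperature after control: {r.temp_c:.1f} C")  # settles near 120.0
```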

The experimental plant at MIT has been dismantled, and the technology is now being further studied at the Novartis headquarters in Basel, Switzerland. The hope is that the continuous-manufacturing method would be more cost-effective. One benefit could be a significantly reduced time between issuing a manufacturing order for a product and having the finished drug in hand. This would be especially helpful during clinical trials, in which companies have to balance the need for sufficient drugs for upcoming trial stages with the risk that most of those drugs will end up failing. The faster production times promised by the continuous method—at least 10 times speedier in the MIT experimental facility—and the smaller scale of production would be much better suited to the uncertain nature of drug development.

The speedier manufacturing could also reduce the risk that pharmaceutical companies face when bringing a new drug to market. “When you launch a new drug, there’s often a lot of uncertainty in demand. Forecasting is very tough in the business,” says Gary Pisano, a Harvard Business School professor who specializes in life science manufacturing. “If you have a small amount of production and the [drug’s sales] takes off, then you are short, and ramping up will be slow. But if you’ve got a big plant for that drug and if it is not successful, then you are stranded,” he says.

The method could also reduce costs, because continuous facilities can be much smaller and require less energy and fewer raw materials. The smaller amounts of material used in continuous processing also demand more control over the reactions, which, in the end, may ensure a higher-quality final product. If you are running a batch process and end up with hundreds or thousands of gallons of a chemical at a certain step, you can in some sense “mix away your mistakes,” says MIT chemical engineer Richard Braatz. But the small volumes and fast reactions typical of continuous pharmaceutical manufacturing require that high product quality be built into the design of the control system.

Yet despite all its benefits, it may be a struggle to bring this new method of drug manufacturing into widespread use. “People have talked a lot about the idea of continuous-flow manufacturing in pharmaceuticals but there’s not been much progress,” says Pisano. “A lot of companies were very conservative about trying anything radically new with their manufacturing,” he says. The batch method, while it has its shortcomings, was tried and true. “Finding a more efficient and effective way to do manufacturing was not high on the priority list,” says Pisano.

This resistance to change is also due to a lack of financial pressure. “For decades, these inefficiencies of batch processing have been masked by large margins earned by blockbuster drug sales, but now the pharmaceutical business model is changing,” says Salvatore Mascia, project manager for the Novartis-MIT Center for Continuous Manufacturing. “The combination of our new technologies with an end-to-end integration strategy will allow production of pharmaceuticals on demand, with benefits in terms of speed, quality, and cost,” he says. As revenues continue to decline for many companies and they move toward more targeted therapies with smaller markets, producers are showing interest in continuous manufacturing.

Allan Myerson, an MIT chemical engineering professor, says the drug industry’s engineers have long understood the potential efficiencies of continuous manufacturing, but never took it seriously because of the relatively small scale at which drugs are produced. “The difference in pharma is that they make so many different products,” says Myerson. “But there is much more economic pressure on pharma now to reduce manufacturing costs.” The MIT-Novartis collaboration demonstrated that companies could use the techniques of continuous manufacturing with only a small facility. “There’s a lot of potential financial as well as environmental benefits,” he says.

In addition to cost savings, continuous manufacturing could also provide benefits in manipulating the chemistry. Take, for example, the ability to use light-dependent reactions, which could give medicinal chemists more options of molecular structures to use when creating new candidate drugs. In batch processing, light cannot efficiently shine through the large volumes of material used, says Tim Jamison, an MIT chemist. The volumes of chemicals used in the team’s continuous system, however, are smaller and flow through tubes that enable more even light exposure. Other kinds of reactions, such as those that produce dangerous chemical intermediates or that run very quickly, are also more amenable to continuous processing. “One of the most exciting aspects is that this could open up new families of chemical structures that really aren’t viable currently and therefore expand treatments we have available for various diseases,” says Jamison.

The pilot facility was built to produce one particular compound. Now, the 11 MIT groups involved in the collaboration continue to find new reactions and tools so that other drug compounds can be produced in the automated, continuous-flow manner. “Traditionally, the industry has not been focused on manufacturing, but there’s a lot of momentum now,” says project director and MIT chemical engineer Bernhardt Trout. “We understand we have to make a long-term commitment to get this started.”

A Startup’s Small Batteries Reduce Buildings’ Electric Bills

Stem uses data analytics and large batteries to cut electricity costs in commercial buildings.

By Martin LaMonica on November 6, 2012

An energy startup called Stem has developed a battery for commercial buildings that’s clever enough to predict—based on the price of electricity—when to store power and when to release it. The market for the company’s technology is limited for now, but its product hints at how distributed energy storage and management could transform the grid.

A building’s owner can save money by analyzing energy-use trends and changing the thermostat settings in response. Dozens of companies are developing analytics software aimed at such efficiency improvements.

Stem, based in Millbrae, California, combines this “big data for buildings” technique with on-site energy storage. The batteries aren’t just intended as backup power. Instead, in concert with software, they’re part of a system designed to allow a building to use the cheapest form of power available at any given moment, whether that power comes from Stem’s batteries or from the grid.

The system uses algorithms adapted from the financial industry to predict a building’s power use on an hour-by-hour basis. The battery can start serving power the minute a higher price takes effect at times of peak use, or help avoid the fees utilities sometimes charge when a building draws too much power at once.
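
Stem has not disclosed its algorithms, but the core dispatch idea can be sketched simply: recharge when grid power is cheap, and serve the building from the battery when it is expensive. The prices, loads, and median-price threshold below are illustrative assumptions; round-trip losses and demand charges are ignored.

```python
# A toy sketch of price-aware dispatch: given hour-by-hour price and
# load forecasts, serve the building from the battery when grid power
# is expensive and recharge when it is cheap. Prices, loads, and the
# median-price threshold are illustrative; Stem's real forecasting and
# optimization are proprietary.

def dispatch(prices, loads_kw, capacity_kwh, power_kw):
    threshold = sorted(prices)[len(prices) // 2]  # median price as cutoff
    soc = 0.0   # battery state of charge, kWh
    grid = []   # power actually bought from the grid each hour, kW
    for price, load in zip(prices, loads_kw):
        if price > threshold and soc > 0:
            served = min(load, power_kw, soc)   # battery covers peak hours
            soc -= served
            grid.append(load - served)
        elif price < threshold and soc < capacity_kwh:
            charge = min(power_kw, capacity_kwh - soc)
            soc += charge
            grid.append(load + charge)          # recharge off-peak
        else:
            grid.append(load)
    return grid

prices = [0.08] * 8 + [0.22] * 8 + [0.12] * 8    # $/kWh over 24 hours
loads = [40] * 8 + [90] * 8 + [50] * 8           # building load, kW
grid = dispatch(prices, loads, capacity_kwh=100, power_kw=25)
before = sum(p * l for p, l in zip(prices, loads))
after = sum(p * g for p, g in zip(prices, grid))
print(f"daily energy bill: ${before:.2f} without battery, ${after:.2f} with")
```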

“This sort of thing was not possible five or six years ago. For every given site, we will literally run millions of simulations a day,” says Stem founder and executive vice president Brian Thompson, a former IT professional who developed high-volume e-commerce systems.

Stem’s batteries are stripped-down lithium-ion automotive batteries linked to power electronics designed to quickly switch between partially powering a building and charging from the grid. The bulk of the analytics are done over the Internet and sent to an on-site computer, which uses machine-learning techniques to improve its energy forecasts. The system scales with the number of batteries used; the smallest installations are about the size of a refrigerator or a dishwasher.

A number of companies already use on-site power generation to capitalize on the difference between peak and off-peak rates. Stem, which was hatched at the Wharton Business School, had initially planned to combine rooftop solar power with batteries.

In California, commercial customers face a complex set of rates designed to lower peak-time demand. Sophisticated pricing structures make a price-optimizing system more worthwhile. With Stem’s system, Thompson says, customers don’t need to change their behavior and can still lower their bills by 5 to 15 percent.

Batteries located at buildings could also add storage to the grid. This would make electricity more reliable and allow for higher penetration of technologies like wind and solar, which produce power intermittently. Utilities have started to use batteries to buffer the grid, but they’re not in wide use because of the cost.

A network of smaller computerized batteries designed to lower a building’s energy bills could help, Thompson argues. He notes that a hotel with about 100 rooms can use a battery with a capacity of 50 to 100 kilowatt-hours, two to four times the capacity of the battery in the Nissan Leaf.

“As battery prices and the prices of computing and data and bandwidth drop, we’re going to see these types of devices in every building in the world, whether it’s in 20 or 30 years,” Thompson says. “These types of systems will allow us to move to a 100 percent renewable future.”

In the near term, the company has set its sights on California and states on the East Coast with similar electricity pricing schemes. Stem is pilot testing with small and medium-sized businesses in about 20 industries, and has deliberately chosen not to sell through slow-moving utilities.

But scaling up with direct sales could become a problem, says Jaideep Raje, an analyst at Lux Research. Energy startups usually need to partner with large corporations or utilities to get legitimacy and gain access to customers. “The smart money would say that as go the large utilities or large industrial players like Honeywell or Johnson Controls, so will go the industry,” he says.

Efficiency Breakthrough Promises Smartphones that Use Half the Power

A startup says it’s cracked a decades-old efficiency problem dogging wireless communications.

By David Talbot on October 31, 2012

Powering cellular base stations around the world will cost $36 billion this year—chewing through nearly 1 percent of all global electricity production. Much of this is wasted by a grossly inefficient piece of hardware: the power amplifier, a gadget that turns electricity into radio signals.

The amplifiers inside smartphones suffer similar problems. If you’ve noticed your phone getting warm and rapidly draining the battery when streaming video or sending large files, blame the power amplifiers. As with the versions in base stations, these chips waste more than 65 percent of their energy—and that’s why you sometimes need to charge your phone twice a day.

Now an MIT spinout company called Eta Devices, based in Cambridge, Massachusetts, and cofounded by two MIT electrical engineering professors, Joel Dawson and David Perreault, says it has cracked the efficiency problem with a new amplifier design.

It’s currently a lab-bench technology, but if it proves itself in commercialization, which is expected to start in 2013—first targeting LTE base stations—the technology could slash base station energy use by half. Likewise, a chip-scale version of the technology, still in development, could double the battery life of smartphones.

“There really has been no significant advance in this area for years,” says Vanu Bose, founder of Vanu, a wireless technology startup. “If you get 30 to 35 percent efficiency with today’s amplifiers, you are doing really well. But they can more than double that.”

Power amplifiers use transistors that consume power in two basic modes: standby mode, and output mode, when they send out pulses of digital data. The only way to improve their efficiency is to use the lowest amount of standby power possible. But making sudden jumps from low-power standby mode to high-power output mode tends to distort signals, so existing technologies keep standby power levels high, wasting electricity.

“It means you are pulling a lot of energy just to keep the thing on,” says Dawson. And the more data you need to send, the worse it gets. “With high data rate communication, you wind up needing far more standby power than signal power. This is why the phone is warm,” he says.

The new advance is essentially a blazingly fast electronic gearbox. It chooses among different voltages that can be sent across the transistor, selecting the one that minimizes power consumption, and it does this as many as 20 million times per second. The company calls the technology asymmetric multilevel outphasing.
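
Eta Devices has not published the details of asymmetric multilevel outphasing, but the “gearbox” intuition can be sketched: for each sample of the signal’s envelope, pick the smallest available supply level that still covers it, rather than running the transistor from a single worst-case supply. The toy envelope and the four supply levels below are assumptions for illustration.

```python
# A toy model of the "gearbox" idea: per envelope sample, select the
# smallest supply level that still covers it, instead of one fixed
# worst-case supply. The envelope and supply levels are invented; this
# is not Eta Devices' actual asymmetric multilevel outphasing design.

import math

def overhead(envelope, supply_levels):
    """Crude proxy for wasted power: headroom between the chosen supply
    level and what the signal momentarily needs, summed over samples."""
    total = 0.0
    for e in envelope:
        level = min(s for s in supply_levels if s >= e)  # the "gear"
        total += level - e
    return total

# A bursty envelope with a high peak-to-average ratio, as in modern
# modulated signals: mostly low, occasionally near the peak.
envelope = [abs(math.sin(0.10 * n) * math.sin(0.013 * n)) for n in range(1000)]
peak = max(envelope)

fixed = overhead(envelope, [peak])                          # one supply
multi = overhead(envelope, [peak * f for f in (0.25, 0.5, 0.75, 1.0)])
print(f"headroom, fixed supply: {fixed:.0f}  four levels: {multi:.0f}")
```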

The problem they are attacking arises not only when you are actively transmitting, but also when you are receiving. In the latter situation, the amplifier is busy as the device continually sends out messages confirming the receipt of packets—collections of bits that make up a unit of Internet communications—or alerting the network when packets are missing. “The transmitter is very active, even when you are downloading a YouTube video—not many consumers realize that,” Dawson says.

That is why reducing the communications involved in correcting for missing packets is the aim of another bandwidth-expanding and energy-saving technology, known as network coding (see “A Bandwidth Breakthrough”).

Eta Devices, funded by $6 million from Ray Stata, cofounder of Analog Devices, and his venture firm, Stata Venture Partners, is expected to formally launch its product in February at Mobile World Congress in Barcelona, Spain. The initial market will be in the developing world, where 640,000 diesel-powered generators are used to power base stations, chewing through $15 billion worth of fuel per year.

But the company is aiming at the massive smartphone market. It hopes that its work on a smartphone chip will ultimately lead to a single power amplifier that can handle all of the different modes and frequencies used by the various global standards, such as CDMA, GSM, and 4G/LTE. (Inside an iPhone 5, for example, there are currently five such chips.)

But the base station application is an important one by itself. In large base stations, the power amplifier typically takes 67 percent of the power, with another 11 percent for air-conditioning, or a total of 78 percent of electricity consumption. The new amplifier would reduce overall power consumption by half, says Mattias Astrom, the company’s CEO. As the global demand for data-rich communications surges, about a million new macro base stations are being deployed each year, most of them with LTE technology, he says.

The technology’s indirect savings could include eliminating air-conditioning in big base stations and reducing the size of backup power systems. “There are a lot of secondary effects that are really important,” says Astrom, whose last company, a mapping company called C3, was sold to Apple. While Eta Devices has been quiet in recent months, “now we are extremely confident about what we have here,” he says.

U.S. small businesses snap four months of job losses

WASHINGTON | Thu Nov 1, 2012 2:01pm EDT

(Reuters) – U.S. small business employment steadied in October after four straight months of job losses as manufacturing firms added workers to their payrolls, a survey showed on Thursday.

The National Federation of Independent Business said the net change in employment per firm edged up to 0.02 last month after declining 0.23 in September.

“Most of the industry groups were still slightly negative, but manufacturing employment growth was still strong and construction was slightly positive,” the NFIB said in a statement.

The survey was released ahead of the government’s more comprehensive payroll count on Friday. Nonfarm payrolls likely increased 125,000 in October, according to a Reuters poll, after rising 114,000 in September.

The unemployment rate is seen inching up a 10th of a percentage point to 7.9 percent. The NFIB survey showed a marginal decline in the share of employers reporting difficulties filling job openings last month.

(Reporting by Lucia Mutikani; Editing by Leslie Adler)

Start-ups plan new ways to deal with future disasters

(Reuters) – Floating robots to gather storm data, fuel cells for power outages, and tools to choose evacuation routes and help responders stay connected to the Internet are among the innovations that increasingly will help responders deal with future disasters, start-up companies say.

These tools are helping track weather patterns and measure their strength, soften their impact and speed recovery. Many are already proving their worth, not just in massive storm Sandy but in other weather disasters such as Hurricane Isaac and this summer’s extreme drought in the U.S. Midwest.

VantagePoint-backed Liquid Robotics, for example, deploys floating robots to measure vast amounts of ocean data. Unlike passive buoys, its robots – called wave gliders – can be sent towards an oncoming storm. Unlike weather-tracking airplanes, the robots measure conditions at the ocean’s surface, providing key information about a storm’s future track and strength.

The wave gliders also provide real-time information on tectonic plate movement, helping scientists determine whether earthquakes will trigger tsunamis. Liquid Robotics equipment is already deployed in some areas around the Pacific, including Indonesia and Japan, for this purpose.

Currently, Liquid Robotics works with some branches of the U.S. National Oceanic and Atmospheric Administration, including the agency’s tsunami warning center and buoy center. But there is room for more government business, said Francois Leroy, senior vice president for science and international sales at the Sunnyvale, California-based company.

While he believes it is too early to approach officials in areas affected by Sandy to market the gliders, the company is planning its approach for a few weeks’ time. “We are collecting the data, preparing the case in terms of what we have done, and can do,” Leroy says.

Another data-intensive company that helps customers deal with weather conditions is San Francisco-based Climate Corp, backed by Index Ventures, Google Ventures and others. It uses myriad weather data points to price crop insurance.

INRIX, backed by August Capital and others, can measure highway traffic flow and help officials choose the best evacuation routes. Its clients include the I-95 Corridor Coalition, serving a major East Coast highway.

TRICK THE RESPONDERS’ SOFTWARE

Fuel cell manufacturers provide power when electric power lines go down. These firms include Bloom Energy, backed by Kleiner Perkins Caufield & Byers and others; and ClearEdge, backed by private-equity firm Kohlberg Ventures and others.

ClearEdge, based in Hillsboro, Oregon, sells its fuel-cell systems in units starting at 5 kilowatts, enough to power a 3,000-square-foot home; the units can also be stacked to provide more power. They work by forcing natural gas through a chemical system to generate power – without burning the gas.

Because natural gas often keeps flowing through pipelines even during power outages, ClearEdge’s systems could be a good choice for areas prone to electrical power failures, said Neal Starling, senior vice president for sales and marketing. Right now he is collecting information on how the natural gas grid held up during Sandy, to be used for sales pitches.

Seattle-based NetMotion Wireless helps officials respond to disasters by ensuring that their networked software programs in key areas like dispatch and power-line location keep operating even as they move in and out of networks and wireless zones.

NetMotion’s mobile virtual private-network technology works by effectively tricking disaster-responders’ software into believing it is still connected to a network even when it is temporarily out of range, preventing it from crashing, losing data, and requiring reboots.
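
NetMotion’s protocol is proprietary, but the session-persistence idea can be sketched as a local wrapper that queues outbound data and transparently reconnects, so the application above it never sees a broken socket. The host, port, and retry policy below are made up for illustration.

```python
# A minimal sketch of the session-persistence idea described above: the
# application writes to a local wrapper that queues data and
# transparently reconnects to the server, so the app never sees the
# broken socket. Hosts and retry policy are made-up examples; this is
# not NetMotion's actual VPN protocol.

import socket
import time
from collections import deque

class PersistentLink:
    def __init__(self, host, port, retry_delay=2.0):
        self.host, self.port = host, port
        self.retry_delay = retry_delay
        self.sock = None
        self.pending = deque()  # data queued while the network is down

    def send(self, data: bytes):
        """Queue data, then try to flush; never raises on network loss."""
        self.pending.append(data)
        self._flush()

    def _flush(self):
        while self.pending:
            try:
                if self.sock is None:
                    self.sock = socket.create_connection(
                        (self.host, self.port), timeout=5)
                self.sock.sendall(self.pending[0])
                self.pending.popleft()  # drop data only once it was sent
            except OSError:
                # Network dropped: close, wait, let a later send retry.
                if self.sock is not None:
                    self.sock.close()
                    self.sock = None
                time.sleep(self.retry_delay)
                return  # give up for now; queued data is preserved

# Usage: a dispatch app keeps calling link.send(...) as the vehicle
# moves in and out of coverage, and no data is lost.
# link = PersistentLink("dispatch.example.com", 9000)
# link.send(b"unit 12 en route\n")
```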

The company’s customers include power provider Con Edison and state police in Connecticut and New York, according to Paul Riebock, who handles federal government relations for NetMotion, backed by Clearlake Capital.

He hopes that what he says was a strong track record during Hurricane Sandy will help win more customers, but does not plan on making any sales calls just yet. “You wait a little for things to settle down a bit,” he said.

(Reporting by Sarah McBride; Editing by Phil Berlowitz)

Apple: ‘From a tech titan into a dinosaur’ — but not for a year

October 29, 2012: 11:51 AM ET

An analyst predicts the company’s demise, yet still rates it a “buy”

FORTUNE — Here’s a quote for Daring Fireball‘s claim chowder:

“As we have stated before on many occasions, Apple’s time to turn from a tech titan into a dinosaur will come, but we still think that we are at least a year away…” — Berenberg Bank’s Adnaan Ahmad

I suppose working for a company that’s been in the private banking business since 1590 gives one a certain historical perspective, but it’s hard to believe — no matter what he wrote on Friday — that Mr. Ahmad really thinks that Apple (AAPL) will face extinction in fiscal 2014.

The occasion for his note — issued the day after Apple’s Q4 2012 report — was the company’s gross margin guidance for Q1 2013: A surprisingly low 36%. Like most analysts, Ahmad is pretty sure that number will trend up over the next 12 months as the company’s production “learning curve” improves.

A bigger issue, he writes, is the margin on the iPad mini, which Apple has priced lower than it might otherwise have in order to squeeze competitors like Amazon (AMZN) and Google (GOOG), which can’t seem to sell their 7-inch tablets for less than $199 without losing money on each sale.

But he believes the biggest worry for Apple — and the one that led to that remark about dinosaurs — is that day in the future when the company’s iPhone business begins to slow down:

“The concern is obviously that with a high teens market share in the smartphone space, its share gains (like this quarter) will tap out given the high price points. This quarter, for example, ASPs were $636, which compares to $55 ASPs at Nokia, $150 ASPs at Samsung and $200-250 ASPs at HTC and RIMM. So for Apple to gain wider share of the smartphone market, it will need to offer ‘customised’ products at lower price points (i.e. pre-paid iPhone), rather than continue with its current strategy of dropping price on a one- or two-year old product. This will have similar margin ramifications to those that the mini has had on the iPad segment, and if Apple decides not to have customised lower-end iPhones, then growth will well and truly tap out once the China Mobile deal is signed (likely in H213). Outside of Samsung playing a price war game against Apple, this slower growth trajectory of the iPhone business is Apple’s single biggest concern in our view.”

Despite his concerns, when it comes to investment advice, Ahmad seems to want it both ways.

“The most-asked question we face,” he writes, “is which suppliers of Apple to own – our usual response is none! Look at it this way: if an 800lb gorilla is having some margin issues, how is its supply base going to fare?…

“Hence our negative bias to Qualcomm, Imagination, Hon Hai, Catcher, TPK, Foxconn Technology and Dialog. If you want to play the volume theme, you would play Apple, Samsung and ARM, all of which we rate as Buys.”

Go figure.

Home prices rise for fifth month in a row

NEW YORK (CNNMoney) — The housing market picked up more momentum in August, as the average home price for 20 major cities jumped 0.9%, according to the S&P/Case-Shiller home price index.

The increase marked the fifth consecutive month of gains for the index with all but one city, Seattle, recording month-over-month price increases.

“The sustained good news in home prices over the past five months makes us optimistic for continued recovery in the housing market,” said David Blitzer, spokesman for S&P.

The Case-Shiller report is one of many gauges of housing market health that has turned upbeat in recent months. New and existing home sales have been stronger, inventory of homes for sale has fallen and developers have stepped up building activity.

Slow improvement in the national economy has also boosted the housing market, as have record low mortgage rates. The rates for a 30-year loan have stayed below 3.7% since May. Combined with home prices that are still about a third less than they were when they hit their peak, these record-low rates have made homebuying very affordable.

Of the cities S&P’s index covers, Phoenix has roared back the fastest, with a whopping 18.8% year-over-year gain in August. That marks the fourth month in a row of double-digit price hikes. Detroit prices rose 7.6% over the past 12 months and Miami’s grew 6.7%.

Mike Larson, a financial analyst with Weiss Research, remains cautious about the outsized gains in Phoenix and some Florida markets. Much of the return represents “a resurgence in investor demand,” he said. Investors now represent about 27% of the home purchases in the market, according to data from the National Association of Realtors.

Most of these buyers are looking to take advantage of beaten down prices so they can rent out the properties at a healthy profit, he said.

“The fly in the ointment is that these buyers lack emotional attachment,” said Larson. So unlike regular homeowners, they will likely not stick with the homes should the market head south again.

Among the three cities to have year-over-year losses, Atlanta recorded the biggest decrease in home values, with prices down 6.1%. New York was down 2.3% and Chicago fell 1.6%.

Rising prices are expected to continue, leading some economists to predict the housing market has finally turned a corner.

“Looking forward, price increases will continue,” said Jed Kolko, chief economist for Trulia. His company has more recent data, for September and October, that shows asking prices on homes have risen.

“Prices on Election Day will be almost the same as when Obama took office, probably just 1.7% below where they were in January 2009,” he said.

Microsoft Seeks an Edge in Analyzing Big Data

Published: October 29, 2012

SEATTLE — Eric Horvitz joined Microsoft Research 20 years ago with a medical degree, a Ph.D. in computer science and no plans to stay. “I thought I’d be here six months,” he said.

He remained at M.S.R., as Microsoft’s advanced research arm is known, for the fast computers and the chance to work with a growing team of big brains interested in cutting-edge research. His goal was to build predictive software that could get continually smarter.

In a few months, Mr. Horvitz, 54, may get his long-awaited payoff: the advanced computing technologies he has spent decades working on are being incorporated into numerous Microsoft products.

Next year’s version of the Excel spreadsheet program, part of the Office suite of software, will be able to comb very large amounts of data. For example, it could scan 12 million Twitter posts and create charts to show which Oscar nominee was getting the most buzz.

A new version of Outlook, the e-mail program, is being tested that employs Mr. Horvitz’s machine-learning specialty to review users’ e-mail habits. It could suggest whether a user will want to read each message that comes in.

Elsewhere, Microsoft’s machine-learning software will crawl internal corporate computer systems much the way the company’s Bing search engine crawls the Internet looking for Web sites and the links among them. The idea is to predict which software applications are most likely to fail when seemingly unrelated programs are tweaked.

If its new products work as advertised, Microsoft will find itself in a position it has not occupied for the last few years: relevant to where technology is going.

While researchers at M.S.R. helped develop Bing to compete with Google, the unit was widely viewed as a pretty playground where Bill Gates had indulged his flights of fancy. Now, it is beginning to put Microsoft close to the center of a number of new businesses, like algorithm stores and speech recognition services. “We have more data in many ways than Google,” said Qi Lu, who oversees search, online advertising and the MSN portal at Microsoft.

M.S.R. owes its increased prominence as much to the transformation of the computing industry as to its own hard work. The explosion of data from sensors, connected devices and powerful cloud computing centers has created the Big Data industry. Computers are needed to find patterns in the mountains of data produced each day.

“Everything in the world is generating data,” said David Smith, a senior analyst with Gartner, a technology research firm. “Microsoft has so many points of presence, with Windows, Internet Explorer, Skype, Bing and other things, that they could do a lot. Analyzing vast amounts of data could be a big business for them.”

Microsoft is hardly alone among old-line tech companies in injecting Big Data into its products. Later this year, Hewlett-Packard will showcase printers that connect to the Internet and store documents, which can later be searched for new information. I.B.M. has hired more than 400 mathematicians and statisticians to augment its software and consulting. Oracle and SAP, two of the largest suppliers of software to businesses, have their own machine-learning efforts.

In the long term, Microsoft hopes to combine even more machine learning with its cloud computing system, called Azure, to rent out data sets and algorithms so businesses can build their own prediction engines. The hope is that Microsoft may eventually sell services created by software, in addition to the software itself.

“Azure is a real threat to Amazon Web Services, Google and other cloud companies because of its installed base,” said Anthony Goldbloom, the founder of Kaggle, a predictive analytics company. “They have data from places like Bing and Xbox, and in Excel they have the world’s most widely used analysis software.”

Like other giants, Microsoft also has something that start-ups like Kaggle do not: immense amounts of money — $67 billion in cash and short-term investments at the end of the last quarter — and the ability to work for 10 years, or even 20, on a big project.

It has been a long trip for Microsoft researchers. M.S.R. employs 850 Ph.D.’s in 13 labs around the world. They work in more than 55 areas of computing, including algorithm theory, cryptography and computational biology.

Machine learning involves computers deriving meaning and making predictions from things like language, intentions and behavior. When search engines like Google or Bing offer “did you mean?” alternatives to a misspelled query, they are employing machine learning. Mr. Horvitz, now a distinguished scientist at M.S.R., uses machine learning to analyze 25,000 variables and predict hospital patients’ readmission risk. He has also used it to deduce the likelihood of traffic jams on a holiday when rain is expected.
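
As a toy illustration of that “did you mean?” example, the sketch below learns word frequencies from a tiny made-up corpus and suggests the most common word within one edit of a misspelled query. Real search engines learn from query logs at vastly larger scale; this is not Bing’s or Google’s actual method.

```python
# A minimal "did you mean?" sketch in the spirit described above: learn
# word frequencies from a corpus, then suggest the most frequent word
# within one edit of a misspelled query. The tiny corpus is invented.

from collections import Counter

CORPUS = ("machine learning helps computers derive meaning from language "
          "and behavior and make predictions about language and behavior")
FREQ = Counter(CORPUS.split())

def edits1(word):
    """All strings one insert, delete, replace, or transpose away."""
    letters = "abcdefghijklmnopqrstuvwxyz"
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [a + b[1:] for a, b in splits if b]
    transposes = [a + b[1] + b[0] + b[2:] for a, b in splits if len(b) > 1]
    replaces = [a + c + b[1:] for a, b in splits if b for c in letters]
    inserts = [a + c + b for a, b in splits for c in letters]
    return set(deletes + transposes + replaces + inserts)

def did_you_mean(word):
    candidates = [w for w in edits1(word) if w in FREQ] or [word]
    return max(candidates, key=FREQ.get)  # most frequent plausible fix

print(did_you_mean("langauge"))  # -> "language"
```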

Mr. Horvitz started making prototypes of the Outlook assistant about 15 years ago. He keeps digital records of every e-mail, appointment and phone call so the software can learn when his meetings might run long, or which message he should answer first.

“Major shifts depend on incremental changes,” he said.

At a retreat in March, 100 top Microsoft executives were told to think of new ways that machine learning could be used in their businesses.

“It’s exciting when the sales and marketing divisions start pulling harder than we can deliver,” Mr. Horvitz said. “Magic in the first go-round becomes expectation in the next.”