Thursday, November 30, 2006

Lab on a Chip

Here's a follow-up to the story I posted yesterday on Pfizer. I think this quote is bang on the money:

"The market for diagnostic equipment is evolving towards fully automated, cost-effective devices usable directly at the point of need."
- Maria Teresa Gatti, Director of Research and Innovation, Advanced System Technology, STMicroelectronics.

************

LONDON — STMicroelectronics has demonstrated a prototype device capable of selectively collecting and manipulating biological molecules, which the chip group suggests opens the way to cost-effective, automated sample preparation for medical and forensic diagnostics.

The system was built using a technology compatible with the MEMS technology that ST uses for its In-Check lab-on-chip devices.

The prototype chip contains a tiny channel, measuring about 1mm in length, 0.1mm in width and 50 microns in height, which is filled with a solution containing the molecules of interest. On the bottom of the channel, an array of tiny platinum electrodes (25 microns wide, separated by 25 microns) provides precise control over the pattern of the electric field in the channel and therefore the forces applied to the biological molecules.

ST says current biotechnological platforms, such as its In-Check devices, work for the diagnosis of specific diseases or the monitoring of food and water for bacterial contaminants by allowing the rapid detection of particular genetic material in liquid biological samples. But the preparation of the samples is still a relatively time-consuming process, performed with large samples in laboratories using techniques that require skilled technicians and are difficult and expensive to implement with smaller samples.

It adds that the aim of its research program is to explore new methods to automate sample preparation, so that the biological molecules of interest could be rapidly extracted from "raw" specimens such as saliva, blood or biopsy tissues and used as the input to the lab-on-chip diagnostic stage.

"The market for diagnostic equipment is evolving towards fully automated, cost-effective devices usable directly at the point of need," said Maria Teresa Gatti, Director of Research and Innovation, Advanced System Technology, STMicroelectronics.

Details of the research project were unveiled at this week's NANOMEC06 Symposium on Materials Science & Materials Mechanics at the Nanoscale, held at the Politecnico di Bari, Italy, in a paper presented by Marco Bianchessi, Sarah Burgarella and Anna Zocco from ST's Advanced System Technology (AST) organization.

The project builds on a prior joint research project between ST and Evotec Technologies GmbH.

The technique used by the ST researchers is based on dielectrophoresis, in which a non-uniform electric field is used to separate biological particles suspended in a conductive solution. Careful setting of the physical and electrical factors allows precise control of the movement of target particles, and the researchers demonstrated that this can be exploited for practical uses.
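
The release doesn't give the underlying equations, but for readers who want the physics, the standard textbook expression for the time-averaged dielectrophoretic force on a small spherical particle of radius r is:

F_{\mathrm{DEP}} = 2\pi \varepsilon_m r^{3}\, \mathrm{Re}\!\left[ \frac{\varepsilon_p^{*} - \varepsilon_m^{*}}{\varepsilon_p^{*} + 2\varepsilon_m^{*}} \right] \nabla \lvert E_{\mathrm{rms}} \rvert^{2}

where \varepsilon_m is the permittivity of the medium and \varepsilon_p^{*}, \varepsilon_m^{*} are the complex, frequency-dependent permittivities of particle and medium. The sign of the bracketed Clausius-Mossotti factor determines whether particles are pulled toward or pushed away from the high-field regions at the electrode edges; that frequency dependence is presumably what the voltage patterns on ST's electrode array exploit, though the release gives no specifics.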

Potential benefits include the ability to isolate cells that are present in low concentrations, to increase the concentration of cells in a solution and to extract DNA from the cell nucleus, as well as allowing sample preparation to be performed in the field by personnel with minimal training on the use of the devices.

Importantly, the researchers also successfully showed that by precisely controlling the voltage applied to different electrodes, cells could be collected at one specific region and then moved to other regions in either direction.

"Sample preparation technology, integrated with ST In-Check lab-on-chip platform, will allow us to build low-cost, easy-to-use systems that will enable diagnostic analyses to be performed outside specialized laboratories, e.g. directly in hospitals or even in the doctors office," noted Anton Hofmeister, Group VP and General Manager, Microfluidics Division, STMicroelectronics.

Wednesday, November 29, 2006

End of an Era

I wasn't surprised at all to see this announcement in today's Wall Street Journal:

"Pfizer Inc. signaled the end of an era in the drug industry by announcing plans to slash its domestic sales force by 20%, or more than 2,000 people. The deep cut by Pfizer, which has fielded the largest sales army for years, could lead to a broader retrenchment across the industry."


As I wrote in my book "Quantum Investing," the future of health care is in quantum-based diagnostics, nanotechnology, and personalized medicine.

Tuesday, November 28, 2006

Shift to larger TVs favors LCD over plasma

Here's a good summary of what's going on in TV land. I haven't purchased a large flat-panel LCD HDTV yet, but it looks like it might be wise to postpone buying one as prices continue to plummet...

*****************

Plasma TV suppliers such as Panasonic maker Matsushita Electric, already outnumbered by the rival LCD camp, are expected to lose further ground as LCD TVs encroach on the 40-inch-class market, a plasma stronghold.

Growing demand for higher-resolution models is also giving a leg up to liquid crystal display (LCD) TVs, promoted by Sony and many others in Taiwan and South Korea, paving the way for consolidation among plasma companies, analysts say.

It is technologically difficult and often costly for plasma makers to offer full high-definition resolution in models with a screen size of less than 50 inches, while LCD TV makers are aggressively promoting full HD models in that segment, although prices are generally higher.

"This Christmas season probably is the last chance for (plasma TV makers) to promote 42-inch models. By this time next year probably there will be no price difference between plasma and LCD TVs," Credit Suisse analyst Wanli Wang said.

With little price difference, most people would choose LCD TVs because of their higher resolution, Wang said.

He expects LCD TV prices to fall 30 percent or more in 2007, compared with a decline of 15 percent to 20 percent for plasma TVs, due to ample LCD panel supplies.

Sharp in August started LCD production at its Kameyama No. 2 plant, the world's first to cut panels from eighth-generation glass substrates, which can yield eight 40-inch-class panels, compared with just three panels from the sixth-generation glass used at its first Kameyama plant.

Size matters
DisplaySearch forecasts that the plasma TV market will start shrinking in 2009 after hitting $24 billion in 2008, while it sees LCD TV demand reaching $75 billion in 2008 and $93 billion in 2010--a trend that will likely make companies offering both LCD and plasma lines think twice about their strategy.

Taiwan's Chunghwa Picture Tubes (CPT) is one such company. It shut down its plasma panel business this year to concentrate on LCDs.

"We cannot focus on two different products because of heavy capex (capital expenditure). That's why we had to choose one," CPT Chief Financial Officer James Wu said.

South Korea's Samsung Electronics and LG Electronics as well as Japan's Hitachi offer both LCD and plasma TVs. Matsushita also sells both products, although it heavily bets on plasma.

"The larger panels become, the more important response speeds for moving images are. In this point, plasma still excels," Matsushita President Fumio Ohtsubo told reporters last month.

CPT's Wu agrees that plasma panels, especially 50-inch and larger ones, do excel LCDs in some aspects of picture quality, but he says the sheer size of the LCD camp will help LCD panels overcome whatever drawbacks they have in a timely manner.

"Globally, so many companies, so many investments, so many people have been working in this area, on this product. So they can improve so quickly," Wu said.

About 80 percent of global flat screen R&D spending is being allocated to LCD panels, and the remaining 20 percent to plasma and some other technologies, Credit Suisse's Wang said.

Getting smaller
In a potential sign of slowing plasma TV demand, Japan's top three plasma TV makers--Matsushita, Hitachi and Pioneer--last month cut their unit sales forecasts by 8 to 20 percent for the year to March.

With the 40-inch-class market gradually taken over by LCD TVs, plasma models need to migrate to the market for 50-inch TVs and above, but demand is not as well developed there, analysts say.

"The United States accounts for more than 70 percent of demand for 50-inch plasma TVs and larger. In other words, there is virtually no 50-inch-class plasma TV market outside the United States," DisplaySearch director Hisakazu Torii said.

Although demand is limited, competition is not necessarily mild. Instead of LCD models, plasma TVs will be pitting themselves against another strong rival, rear-projection TVs.

"If you take a long-term view on the plasma industry, prices are coming down and revenue will not be growing that much. That makes aggressive investments for future growth difficult," iSuppli Japan director Junzo Masuda said.

"The number of players will likely be getting smaller and smaller," he said.

Boomer TV

As the co-author of "Boomernomics," I had to smile when I saw an ad for the recently launched
"Baby Boomer" TV network in Forbes magazine. Here's the link to the website, which is still under construction:

http://www.babyboomertv.net/

Bernanke on Productivity

Here's a passage from Fed Chairman Bernanke's speech today that I thought was insightful. Even though U.S. productivity growth - one of the most important economic indicators, if not the most important - has slowed in recent quarters, the longer-run trend still looks promising...


That said, longer-run trends in the growth of productivity are very difficult to predict. During the first half of the decade, productivity in the nonfarm business sector increased at an unusually high average annual rate of about 3 percent. However, according to current estimates, productivity growth slowed in the second quarter of this year and came to a halt in the third quarter. Moreover, the strength of recent hiring raises the possibility of subpar productivity growth in the fourth quarter as well. When all is said and done, however, I expect that the latest numbers will turn out to have been a reflection of the typical volatility in the data and some cyclical response to the slowing in economic activity, not a signal of a sea change in the longer-run outlook for productivity growth.

21CN

Another communications milestone from the folks at British Telecom...


The first stage of a project to build one of the world's most advanced telephone networks has been completed.

The so-called 21st century network (21CN) is being built in the UK using Internet Protocol technology.

The massive upgrade, the first of its kind, will cost British Telecom £10bn and take until 2010 to complete.

It will open the way to new services as well as making existing services quicker and cheaper than before.

1,500 man years

The first person to use the new network was schoolgirl Laura Wess from South Wales.

The eleven-year-old spent a minute and a half chatting to the Right Reverend John Davies, the bishop of St Asaph.

She was chosen for the landmark call as she is one of the residents of Wick, near Cardiff, which is the first village to be upgraded to 21CN.

"Today marks a symbolic and momentous occasion for BT, the communications industry, for Wales and the rest of the UK as 21CN, over three years in the making, starts to become real for customers," said Paul Reynolds, chief executive of BT Wholesale.

BT has so far rebuilt around 10% of its network, laid more than 2,300 kilometres of new fibre optic cable in South Wales and invested more than 1,500 man years in developing the systems to support the new network.

Customers in Cardiff, Bridgend and Pontypridd will be the next to be transferred. The upgrade does not require customers to have a new telephone or number and can be done without an engineer visiting the premises.

Voice, data, broadband and multimedia services will all be carried on the new network. It will allow for faster broadband speeds as well as opening the door for services not yet thought of.

Mind Set

I've never been a fan of the idea of a "one world government." I came across this review of John Naisbitt's new book "Mind Set" in the Wall Street Journal and was delighted to see that we were kindred spirits on this topic. Here's the excerpt from the WSJ:


John Naisbitt has sold nine million copies of "Megatrends" (1982) and is often asked, as a result: "What is the next big thing?" Here he advises readers to adopt "mindsets" that will allow them to do their own prognosticating. Mindsets, he explains, are "how we receive information." The wife of a philanderer, he notes, filters information in a particular way: Her mindset is both a prism and a prison. Mr. Naisbitt illuminates 11 liberating mindsets -- such as "the future is embedded in the present" -- that may unleash one's inner clairvoyant or at least help one to see the present differently. He weaves his personal story into the advice-giving. Growing up amid the constraints of a Mormon household, Mr. Naisbitt says, he greatly benefited from an early application of thinking outside the box: An uncle treated a painful ear by blowing forbidden tobacco smoke into the raging orifice -- quite different and more effective, Mr. Naisbitt discovered, than the usual laying on of hands. Sensing truth and adventure beyond Utah, he joined the Marines and later worked for JFK, LBJ and IBM; he now teaches at Nanjing University in China. Mr. Naisbitt maintains that "economic domains" are replacing nation-states, and he offers a gloomy forecast for one-worlders: "Why would we add a world government at a time when we have been subtracting power from the hands of national governments through privatization and global communications?" As they may still say in Utah, hallelujah to that.


Hallelujah indeed!

Monday, November 27, 2006

Another Tech Milestone

A Penny Per MIPS!

The unofficial motto of high tech may be "smaller, cheaper, faster," but it's easy to forget how far we've come and how fast. In a post Sunday, Chris Anderson, of Long Tail fame, took note of a milestone in computing economics -- we have recently reached a consumer price on processing power of a penny per MIPS (million instructions per second). Intel's Core Duo running at 2.13 GHz costs around $200 at retail and can perform about 20,000 MIPS. "I remember my first 6 MHz 286 PC in 1982 that did 0.9 MIPS," Anderson writes. "I have no idea what the CPU cost then, but the PC it came in cost nearly $3,000 so it couldn't have been cheap. Say it was around $1,000/MIPS back then. Now it's $0.01/MIPS. I know I shouldn't be astounded by Moore's Law anymore, but that really is something."
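
Anderson's arithmetic is easy to reproduce. Here's a rough sketch in Python; the implied halving rate is derived from his two data points, not something he states:

import math

price_1982 = 1000.0              # Anderson's guess: roughly $1,000 per MIPS in 1982
price_2006 = 200.0 / 20000.0     # ~$200 Core Duo / ~20,000 MIPS = $0.01 per MIPS
years = 2006 - 1982

halvings = math.log2(price_1982 / price_2006)   # number of 2x price drops
print(f"${price_2006:.2f} per MIPS today")
print(f"{halvings:.1f} halvings over {years} years, "
      f"or one halving roughly every {12 * years / halvings:.0f} months")

One halving roughly every 17 months is close to the classic Moore's Law cadence, which is why the figure should astound but not surprise.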


Alec Saunders offers a few more data points:
• In 1977, Digital Equipment’s Vax 11/780 was a 1 MIPS minicomputer, and the Cray-1 supercomputer delivered blindingly fast execution at 150 MIPS.
• A 1999 era Pentium III/500 delivered 800 MIPS of processing power.
• A year later, in 2000, the Playstation 2 pumped out an astounding 6000 MIPS.
• Current embedded processors (like the PXA900 in [the] Blackberry Pearl, or the ARM 1136 in the Nokia N93 ...) are capable of 2000-era desktop processor speeds — in the range of 1000 MIPS, depending on battery consumption.

"It’s 2006 now." Saunders writes. "If the current trend holds true, and we can each carry 20,000 MIPS of processing power in the palms of our hands by 2012, what will we do with that power?"

We'll see!

On Turning Ford Around

Here's an excerpt of a Q&A with Harvard Business School professor Joseph L. Bower on Ford Motor Company that I think hits the nail right on the head.

************************

Q: Let me ask you one more question, and that is, putting on your expert strategist hat, Ford Motor Company's in trouble. It's losing a lot of money, in particular in North America. What's the most important thing it has to do to get back in the race, as it were, with a successful company like Toyota?

A: It has to make good cars. Basically, Ford has been holding its position on the basis of SUVs and the pick-up.

Q: Trucks.

A: Trucks, you could say. And really, Toyota, the Camry, just pushed the Mercury and the Taurus right out of the business, and Lexus took Lincoln down. And that takes you back a decade, anyway; they've been in bad shape in the car business. They've got to rebuild their position in cars, and to do that they have to make great products. And they haven't done that in a while, and the consumer's onto it.

Q: And I assume it's going to take a long—a certain amount of time, for them to start making great products again?

A: I would guess so. I'm not a car person, but yes. I think that's probably what appealed to them about Mulally because everybody was singing the praises of Airbus and then, five years later, Boeing has recaptured its leadership. So I'm sure that's the dream that they have at Ford.

***********************

Here's the link to the entire interview:

http://hbswk.hbs.edu/item/5553.html

The Blind Side

Here's a link to a terrific interview with Michael Lewis, author of the recently published book "The Blind Side." I'm halfway through "The Blind Side" and loving it.

http://money.cnn.com/magazines/fortune/fortune_archive/2006/10/30/8391798/index.htm

I especially enjoyed this remark from Lewis:

"The truth is that technological change is at the center of American prosperity."

Right on the mark, baby!

Gentlemen, Start Your (Electric) Engines!

Paul Boutin took the high-tech Tesla for a spin recently and had this to say:

A week ago, I went for a spin in the fastest, most fun car I've ever ridden in—and that includes the Aston Martin I tried to buy once. I was so excited, in fact, that I decided to take a few days to calm down before writing about it. Well, my waiting period is over, I'm thinking rationally, and I'm still unbelievably stoked about the Tesla.


I've always marveled at how long the antique internal-combustion engine has survived. By 2006 standards, my car's power plant is a noisy, heat-blasting, poison-spewing monster with way too many moving parts. One spin in a Tesla made me realize that the gas engine might finally be on its last legs—and not because electric cars will help wean us from Saudi oil and save us from global warming. Rather, the Tesla Roadster is a rolling demo that proves electric cars now outperform their gas-guzzling counterparts in comfort, convenience, and, best of all, speed.

Hear, hear!

Tuesday, November 21, 2006

Kudos to "More Than You Know"

Kudos to our good friend Michael Mauboussin and his book "More Than You Know" for being selected by Strategy & Business as "The Best Business Book in Economics for 2006."

If you haven't read Michael's book yet, we highly recommend it. You can pick up a copy at the link below:

http://www.amazon.com/More-Than-You-Know-Unconventional/dp/0231138709/sr=8-1/qid=1164115863/ref=pd_bbs_sr_1/002-4603730-3176851?ie=UTF8&s=books

Monday, November 20, 2006

Like the Air We Breathe

Here's an interesting quote from futurist Bruce Sterling:


"Reading the Pew [Internet & American Life Project] study, it becomes clear that we're entering a new era, the post-Internet age, a world in which the Net will be everywhere, like the air we breathe, and we'll take it for granted. It will be neither the glossy nirvana of technophillic dreams nor the dystopia of traditionalist nightmares. It will look a lot like today - but with higher contrast, sharper focus, and a wide-angle lens."

WIRED, 12/2006

Green Dreams

Here are two quotes from an article published recently in The Economist. I'm not 100% sure about the first quote, but I believe the second one is right on the money!


Clean energy now gobbles up almost a tenth of America's venture capital. After years of wondering what would be the next big thing after the dotcom boom, America's technology industry is betting on alternative energy.

Society should rejoice that greenery is in vogue. Markets too will over the long term come to value the technologies in which the clean-energy business is investing... Some investors are sure to see their shirts blown away in the wind.

Wednesday, November 15, 2006

The Greatest Money Manager of Our Time

FORTUNE recently published an article on my friends Bill Miller and Michael Mauboussin of Legg Mason Capital Management. It's a good read! Click on the link below to read it.


http://money.cnn.com/2006/11/14/magazines/fortune/Bill_miller.fortune/index.htm?postversion=2006111507

Tuesday, November 14, 2006

35 years of Intel chip design

The folks at ZDNet have put together a nice piece showing how Intel's chip designs have evolved over the past three and a half decades.

35 years ago we had: The 4004 Microprocessor
This was Intel's first microprocessor. It sparked a technological revolution because it was the first product to fuse the essential elements of a programmable computer into a single chip.

The 4004 was designed to be a calculator component for a Japanese manufacturer, which initially owned all rights to the chip. At the time, most Intel executives saw little promise in the product. Since then, processors have allowed manufacturers to embed intelligence into PCs, elevators, air bags, cameras, cell phones, beepers, key chains and farm equipment, among other devices.

And tomorrow we will have:
A chip with 80 processing cores. The chip will be able to perform 1 trillion floating-point calculations per second, or 1 teraflop.


Check out the fascinating slideshow at the link below.

http://content.zdnet.com/2346-9595_22-37087-1.html

The Evolving Web

Here's an interesting piece by John Markoff on how the web might evolve in coming years.

*************************

Entrepreneurs See a Web Guided by Common Sense
By JOHN MARKOFF

SAN FRANCISCO, Nov. 11 — From the billions of documents that form the World Wide Web and the links that weave them together, computer scientists and a growing collection of start-up companies are finding new ways to mine human intelligence.

Their goal is to add a layer of meaning on top of the existing Web that would make it less of a catalog and more of a guide — and even provide the foundation for systems that can reason in a human fashion. That level of artificial intelligence, with machines doing the thinking instead of simply following commands, has eluded researchers for more than half a century.

Referred to as Web 3.0, the effort is in its infancy, and the very idea has given rise to skeptics who have called it an unobtainable vision. But the underlying technologies are rapidly gaining adherents, at big companies like I.B.M. and Google as well as small ones. Their projects often center on simple, practical uses, from producing vacation recommendations to predicting the next hit song.

But in the future, more powerful systems could act as personal advisers in areas as diverse as financial planning, with an intelligent system mapping out a retirement plan for a couple, for instance, or educational consulting, with the Web helping a high school student identify the right college.

The projects aimed at creating Web 3.0 all take advantage of increasingly powerful computers that can quickly and completely scour the Web.

“I call it the World Wide Database,” said Nova Spivack, the founder of a start-up firm whose technology detects relationships between nuggets of information by mining the World Wide Web. “We are going from a Web of connected documents to a Web of connected data.”

Web 2.0, which describes the ability to seamlessly connect applications (like geographic mapping) and services (like photo-sharing) over the Internet, has in recent months become the focus of dot-com-style hype in Silicon Valley. But commercial interest in Web 3.0 — or the “semantic Web,” for the idea of adding meaning — is only now emerging.

The classic example of the Web 2.0 era is the “mash-up” — for example, connecting a rental-housing Web site with Google Maps to create a new, more useful service that automatically shows the location of each rental listing.

In contrast, the Holy Grail for developers of the semantic Web is to build a system that can give a reasonable and complete response to a simple question like: “I’m looking for a warm place to vacation and I have a budget of $3,000. Oh, and I have an 11-year-old child.”

Under today’s system, such a query can lead to hours of sifting — through lists of flights, hotel and car rentals — and the options are often at odds with one another. Under Web 3.0, the same search would ideally call up a complete vacation package that was planned as meticulously as if it had been assembled by a human travel agent.

How such systems will be built, and how soon they will begin providing meaningful answers, is now a matter of vigorous debate both among academic researchers and commercial technologists. Some are focused on creating a vast new structure to supplant the existing Web; others are developing pragmatic tools that extract meaning from the existing Web.

But all agree that if such systems emerge, they will instantly become more commercially valuable than today’s search engines, which return thousands or even millions of documents but as a rule do not answer questions directly.

Underscoring the potential of mining human knowledge is an extraordinarily profitable example: the basic technology that made Google possible, known as “Page Rank,” systematically exploits human knowledge and decisions about what is significant to order search results. (It interprets a link from one page to another as a “vote,” but votes cast by pages considered popular are weighted more heavily.)
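
For the curious, that "weighted votes" idea is easy to see in a toy power-iteration sketch. The four-page link graph below is made up for illustration and has nothing to do with Google's production system:

links = {               # hypothetical four-page web: page -> pages it links to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
pages = list(links)
damping = 0.85                                # standard damping factor
rank = {p: 1.0 / len(pages) for p in pages}   # start with equal scores

for _ in range(50):                           # iterate until the scores settle
    new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
    for page, outgoing in links.items():
        share = damping * rank[page] / len(outgoing)
        for target in outgoing:               # each link is a "vote" weighted by
            new_rank[target] += share         # the voter's own current rank
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))

Page C comes out on top because every other page links to it, and A benefits in turn because the popular C links back to A; that feedback between voter and vote is the essence of the scheme.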

Today researchers are pushing further. Mr. Spivack’s company, Radar Networks, for example, is one of several working to exploit the content of social computing sites, which allow users to collaborate in gathering and adding their thoughts to a wide array of content, from travel to movies.

Radar’s technology is based on a next-generation database system that stores associations, such as one person’s relationship to another (colleague, friend, brother), rather than specific items like text or numbers.

One example that hints at the potential of such systems is KnowItAll, a project by a group of University of Washington faculty members and students that has been financed by Google. One sample system created using the technology is Opine, which is designed to extract and aggregate user-posted information from product and review sites.

One demonstration project focusing on hotels “understands” concepts like room temperature, bed comfort and hotel price, and can distinguish between concepts like “great,” “almost great” and “mostly O.K.” to provide useful direct answers. Whereas today’s travel recommendation sites force people to weed through long lists of comments and observations left by others, the Web 3.0 system would weigh and rank all of the comments and find, by cognitive deduction, just the right hotel for a particular user.

“The system will know that spotless is better than clean,” said Oren Etzioni, an artificial-intelligence researcher at the University of Washington who is a leader of the project. “There is the growing realization that text on the Web is a tremendous resource.”
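
To make the graded-opinion idea concrete, here is a toy sketch along those lines; the strength lexicon and the reviews are invented for illustration and say nothing about how Opine is actually built:

# Rank hotels by aggregating user comments with a hand-made strength lexicon.
strength = {"spotless": 3, "great": 3, "almost great": 2, "clean": 2,
            "mostly o.k.": 1, "dirty": -2}

reviews = {
    "Hotel Alpha": ["spotless", "great", "clean"],
    "Hotel Beta":  ["clean", "mostly o.k."],
    "Hotel Gamma": ["almost great", "dirty"],
}

def score(comments):
    """Average the lexicon strengths of the comments we recognize."""
    known = [strength[c] for c in comments if c in strength]
    return sum(known) / len(known) if known else 0.0

for hotel, comments in sorted(reviews.items(), key=lambda kv: -score(kv[1])):
    print(f"{hotel}: {score(comments):.2f}")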

In its current state, the Web is often described as being in the Lego phase, with all of its different parts capable of connecting to one another. Those who envision the next phase, Web 3.0, see it as an era when machines will start to do seemingly intelligent things.

Researchers and entrepreneurs say that while it is unlikely that there will be complete artificial-intelligence systems any time soon, if ever, the content of the Web is already growing more intelligent. Smart Webcams watch for intruders, while Web-based e-mail programs recognize dates and locations. Such programs, the researchers say, may signal the impending birth of Web 3.0.

“It’s a hot topic, and people haven’t realized this spooky thing about how much they are depending on A.I.,” said W. Daniel Hillis, a veteran artificial-intelligence researcher who founded Metaweb Technologies here last year.

Like Radar Networks, Metaweb is still not publicly describing what its service or product will be, though the company’s Web site states that Metaweb intends to “build a better infrastructure for the Web.”

“It is pretty clear that human knowledge is out there and more exposed to machines than it ever was before,” Mr. Hillis said.

Both Radar Networks and Metaweb have their roots in part in technology development done originally for the military and intelligence agencies. Early research financed by the National Security Agency, the Central Intelligence Agency and the Defense Advanced Research Projects Agency predated a pioneering call for a semantic Web made in 1999 by Tim Berners-Lee, the creator of the World Wide Web a decade earlier.

Intelligence agencies also helped underwrite the work of Doug Lenat, a computer scientist whose company, Cycorp of Austin, Tex., sells systems and services to the government and large corporations. For the last quarter-century Mr. Lenat has labored on an artificial-intelligence system named Cyc that he claimed would some day be able to answer questions posed in spoken or written language — and to reason.

Cyc was originally built by entering millions of common-sense facts that the computer system would “learn.” But in a lecture given at Google earlier this year, Mr. Lenat said, Cyc is now learning by mining the World Wide Web — a process that is part of how Web 3.0 is being built.

During his talk, he implied that Cyc is now capable of answering a sophisticated natural-language query like: “Which American city would be most vulnerable to an anthrax attack during summer?”

Separately, I.B.M. researchers say they are now routinely using a digital snapshot of the six billion documents that make up the non-pornographic World Wide Web to do survey research and answer questions for corporate customers on diverse topics, such as market research and corporate branding.

Daniel Gruhl, a staff scientist at I.B.M.’s Almaden Research Center in San Jose, Calif., said the data mining system, known as Web Fountain, has been used to determine the attitudes of young people on death for an insurance company and was able to choose between the terms “utility computing” and “grid computing” for an I.B.M. branding effort.

“It turned out that only geeks liked the term ‘grid computing,’ ” he said.

I.B.M. has used the system to do market research for television networks on the popularity of shows by mining a popular online community site, he said. Additionally, by mining the “buzz” on college music Web sites, the researchers were able to predict songs that would hit the top of the pop charts in the next two weeks — a capability more impressive than today’s market research predictions.

There is debate over whether systems like Cyc will be the driving force behind Web 3.0 or whether intelligence will emerge in a more organic fashion, from technologies that systematically extract meaning from the existing Web. Those in the latter camp say they see early examples in services like del.icio.us and Flickr, the bookmarking and photo-sharing systems acquired by Yahoo, and Digg, a news service that relies on aggregating the opinions of readers to find stories of interest.

In Flickr, for example, users “tag” photos, making it simple to identify images in ways that have eluded scientists in the past.

“With Flickr you can find images that a computer could never find,” said Prabhakar Raghavan, head of research at Yahoo. “Something that defied us for 50 years suddenly became trivial. It wouldn’t have become trivial without the Web.”

Friday, November 10, 2006

Intel eyes nanotubes for future chip designs

Michael Kanellos at CNet.com notes that Intel has their eyes on carbon nanotubes.
Anyone who has read "Quantum Investing" will not be surprised by this story. :)

**************************

Intel is eyeing carbon nanotubes as a possible replacement for copper wires inside semiconductors, a switch that one day could eliminate some big problems for chipmakers.

The chip giant has managed to create prototype interconnects--microscopic metallic wires inside of chips that link transistors--out of carbon nanotubes and measure how well the interconnects perform. In essence, the experiments are a way to test whether the theories about the properties of carbon nanotubes are accurate.

Mike Mayberry, director of components research at Intel's labs in Oregon, will discuss the research at the International Symposium for the American Vacuum Society next week in San Francisco. Intel worked with California Institute of Technology, Columbia University, University of Illinois at Urbana-Champaign, and Portland State University on the project.

Chip interconnects have become a looming headache for chipmakers. Under Moore's Law, chipmakers shrink the components inside semiconductors every two years. Shrinking interconnects, however, increases electrical resistance, which in turn reduces performance. Chipmakers switched from aluminum to copper interconnects in the late 1990s to get around the problem. Unfortunately for Intel and other companies, the resistance will start to become a significant problem in smaller copper interconnects in the coming years.

"With metals, as you reduce the diameter of the interconnect, the resistance can go way up," said Dave Lammers, a director at VLSI Research, a semiconductor analysis firm. "The electrons carom off the metal atoms. That is going to slow things down."

Lammers first wrote about the experimental interconnects in The Chip Insider, VLSI's newsletter.

Carbon nanotubes, the reigning celebrity of the nanotechnology world, conduct electricity far better than metals. In fact, nanotubes exhibit what's called ballistic conductivity, which means that electrons are not scattered or impeded by obstacles.

Nanotubes, which measure only a few billionths of a meter thick, are also far thinner than metal interconnects can be made. Potentially, this eliminates the problem with shrinking interconnects. IBM and others have made transistors out of carbon nanotubes.

In its experiment, Intel aligned bundles of nanotubes by means of an electric field and then measured their frequency with fairly standard equipment.

There is, of course, a catch. Although they exhibit unusual and beneficial properties, carbon nanotubes are difficult to mass manufacture. Some nanotubes are semiconductors, meaning the transmission of electrons can be controlled, while others are pure conductors, depending on the arrangement of the atoms. Some are long; others are short. Nanotubes produced in the same batch will contain a dizzying array of characteristics.

Since each chip would require thousands of nanotubes for interconnects, researchers are going to have to figure out a way to produce uniform ones, or quickly separate the good ones from the chaff.

"With (contemporary) interconnects, you dig a trench and fill it up with metal," Lammers said.

As a result, carbon nanotube interconnects won't likely appear in a commercial chip for several years at best.

Whether carbon nanotubes make it into chips or not, the basic structures and materials inside semiconductors will change radically in the next two decades. Around 2010 or 2012, researchers will begin to narrow down what changes will have to occur and then chips that combine silicon elements with newer nano elements will likely begin to creep in toward the middle of that decade. In the 2020s, the ability to shrink silicon chips will likely end and necessitate a shift to very different materials.

Economic Realities

Anatole Kaletsky, who is a partner at GaveKal, hits on a very important point in a recent editorial printed in the London Times:

Particularly relevant to the contrast between Japan and Britain are two trends. First, manufactured goods, whose production is readily transferable to low-cost economies such as China, are falling relentlessly in price. Secondly, this process of outsourcing has created a new type of business, called “the platform company” by Charles Gave, the French economist (and my partner in an economic consulting business).

Platform companies sell everywhere but produce nowhere — businesses such as Dell, Nokia, Ikea, Glaxo, Apple or L’Oréal. Where, for example, are the factories owned by Ikea or Dell? They do not exist, because these companies subcontract almost all their manufacturing to other businesses, mostly in developing countries. Any business process can be divided into three stages — design, production and marketing — and platform companies have perceived that the relative value of these stages has fundamentally changed. In the 20th century, control over production was the key to business success. Today the other two stages add most value, because production can be shifted to subcontractors in developing countries that compete intensely to reduce costs.

This outsourcing is familiar enough, but its macroeconomic implications are less well understood. Because the manufacture of physical goods is the most volatile and capital-intensive part of the business process, outsourcing does not just transfer jobs and factories — platform companies also outsource to China and other developing countries much of the economic volatility that goes with capital investment, inventory cycles and the unionised factory employment.

At the macroeconomic level, therefore, the platform company model has produced several unexpected results. Large trade surpluses and high levels of investment, which used to be indicators of economic dynamism, may now be symptomatic of a country’s reluctance to integrate fully with the world economy and capitalise on the opportunities presented by free trade.

Wednesday, November 08, 2006

Schumpeter and Machiavelli

I came across this passage while re-reading Howard Rheingold's book "Smart Mobs." It's right on the money.

********************

"New technologies have a history of destroying the dominance of prior technologies or making them obsolete. Joseph Schumpeter claimed, “This process of Creative Destruction is the essential fact about capitalism.” Lawrence Lessig reminded me of Machiavelli’s counterpoint to Schumpeter: “Innovation makes enemies of all those who prospered under the old regime, and only lukewarm support is forthcoming from those who would prosper under the new.” Those who created an infrastructure in which the devices (telephones, televisions, and radios) are inexpensive and dumb, the network that connects the devices is highly specialized and expensive to install, and the service is sold on a metered basis (telephony, cable TV, and wired Internet access) are challenged by new enterprises in which cheap devices are the network, and no private enterprise owns the medium that carries their messages. The old telecommunications regime, if it is to survive, must either block challenging innovations politically, acquire the companies that challenge them, or change into different kinds of enterprises themselves. The market and the consumer have no obligation to remain loyal to obsolete technologies when something better comes along; just because Western Union had a large investment in telegraphy doesn’t mean that telephony should have been prevented through regulation or legislation.”

Monday, November 06, 2006

Powered by the Sun

I came across this story on WIRED.com's website. It's a good illustration of what happens when prices (in this case, energy prices) rise.

********************************

In a world where sun-powered garden lights seem like a nifty idea, new technologies touted by solar energy startups sound very far out.

Entrepreneurs promise that soon solar-energized "power plastic" will radically extend the battery life of laptops and cell phones. Ultra-cheap printed solar cells will enable construction of huge power-generating facilities at a fraction of today's costs. And technologies to integrate solar power-generation capability into building materials will herald a new era of energy-efficient construction.

Those are ambitious goals for a technology famous for powering pocket calculators, but investors are paying heed. This year, solar startups have snapped up more than $100 million in venture capital to develop printable materials capable of converting sunlight into electrical power. Soaring energy demand, as well as short supplies of polysilicon, a key ingredient in most solar cells, is fueling interest in alternative materials.

"These technologies look incredibly more real than they did five years ago," said Dan Kammen, founding director of the Renewable and Appropriate Energy Laboratory at the University of California at Berkeley. Kammen predicts solar sources, which today produce less than 1 percent of power consumed nationwide, could eventually meet one-fifth of U.S. energy demand.

Printed solar cells, produced with conductive metals and organic polymers in place of silicon, could help. As early as next year, startups plan to begin manufacturing printed solar products for use in power-generating facilities, rooftop installations and portable gadgets. While industry experts don't expect manufacturing on a massive scale to be viable for years, production capability is ramping up quickly.

Executives at Nanosolar, based in Palo Alto, California, plan to finish building a factory next year to churn out thin-film solar cells using copper-based semiconductors instead of silicon.

"Silicon models are too expensive in the first place," said Martin Roscheisen, Nanosolar's CEO, who expects the company will be able to build a 400-megawatt plant for about $100 million. Providing equivalent capacity using silicon technology, Roscheisen estimated, would cost close to $1 billion.

When Nanosolar's products become commercially available, Roscheisen plans to warranty the cells for 25 years -- similar to silicon solar products.

Miasolé, in neighboring Santa Clara, California, has developed a competing thin-film photovoltaic cell using a layer of photoactive material containing a compound called CIGS. The company plans to incorporate the technology into building materials and rooftop solar installations.

On the shorter end of the power-generation life cycle, Konarka, a startup in Lowell, Massachusetts, has agreements in place with manufacturers to produce a printed "power plastic" to supply solar energy for portable devices.

"When people think of solar, they think of rooftop, grid-connected. We're trying to change that mindset," said Daniel Patrick McGahn, Konarka's chief marketing officer. Unlike silicon-based solar cells used on rooftops today, Konarka's specialized plastics typically last years, but not decades. The company is marketing its technology for use in products with similar life spans.

While research into printed photovoltaic technologies dates back decades, progress on non-silicon applications has accelerated in recent years due to the shortage of polysilicon, said Travis Bradford, president of the Prometheus Institute for Sustainable Development in Cambridge, Massachusetts. Today, nearly 95 percent of solar cells use semiconductor-grade silicon, he estimates, but that should drop to around 80 percent over the next few years.

To compete against silicon solar manufacturers, Bradford says developers of new technologies will need to show that they can be cost-effective. They'll also have to prove supplies of core materials are adequate for mass production and demonstrate that their products don't degrade too quickly. While he's optimistic about the prospects, he's not convinced any technology is meeting all the criteria today.

"It takes a lot longer and a lot more money to commercialize technology than people think ... which is why crystalline silicon has been around for so long," he said.

Still, printed photovoltaics could soon be ready for commercial use, said Raghu Das, CEO of research firm IDTechEx. The key hurdle remaining is to make materials resilient enough to last for years. Das expects manufacturers to resolve those concerns and produce viable printed photovoltaics in 2009 or 2010. He envisions large-scale deployment around 2012.

In the meantime, solar startups entice investors with visions of clean, low-cost, energy-generating capability bundled into a range of products, from building materials to cell phones. While that vision may eventually prove realistic, says Das, it's still quite futuristic.

"As plastics are used to make this and not silicon, it will be incredibly low-cost -- you could compare it to the cost of printing ink on paper," he said. "However, if it was ready today, everybody would be doing it."

Our Brave New World

Here's a quote I came across while reading a book titled "Our Brave New World" by my friends at GaveKal Research. It hits the nail right on the head!

"When the process of creative destruction is allowed to work, we get both income disparity and the ability of people to 'move up.' When income disparity is constrained, the ability of people to climb the social ladder disappears." This is why, in large parts of Europe, 'l'ascenceur social est en panne.'

Getting the LED Out

Here's a story from Michael Kanellos at CNet.com on how LEDs could start replacing lightbulbs soon.

*******************

SAN JOSE, Calif.--Light-emitting diodes will become economically attractive as replacements for conventional lightbulbs in about two years, a shift that could pave the way for massive electricity conservation, according to a researcher.

Right now, consumers and businesses can buy a light-emitting diode, or LED, that provides about the same level of illumination as an energy-hogging conventional 60-watt lightbulb, Steven DenBaars, a professor of materials science at the University of California Santa Barbara, said at the SEMI NanoForum, taking place here this week. A principal advantage of the LED: It lasts about 100,000 hours, far longer than the conventional filament bulb.

Unfortunately, the LEDs that can perform this task cost about $60, he said. (Prices vary on the Internet.) But prices have been declining by 50 percent a year, so two years from now the same LED should cost around $20.

"At $20 the payback in energy occurs in about a year," DenBaars said. The rapid return on investment will occur in places such as stores and warehouses, where the light is on through much of the day. A year after that, LEDs will be even more economical for more places as costs continue to decline.

Approximately 22 percent of the electricity consumed in the United States goes toward lighting, according to the U.S. Department of Energy.

To make matters worse, traditional lightbulbs are incredibly inefficient. Only about 5 percent of the energy that goes into them turns into light. The majority gets dissipated as heat.

If 25 percent of the lightbulbs in the U.S. were converted to LEDs putting out 150 lumens per watt (higher than the commercial standard now), the U.S. as a whole could save $115 billion in utility costs, cumulatively, by 2025, said DenBaars, and it would alleviate the need to build 133 new coal-burning power stations.

In turn, carbon emissions in the atmosphere would go down by 258 million metric tons.

"Multiply that by three and you get the worldwide savings," he stated. DenBaars then showed a picture of the globe at night. The landmass of the U.S. could easily be picked out by nighttime lights.

"We shoot a lot of light into space that doesn't need to be there," he noted.

Rising prices of electricity, combined with the antiquated nature of lightbulb technology, have prompted several start-ups and large industrial concerns to get into lighting.

Fiberstars, for instance, has come up with a way to replace hot fluorescent tube lights with light-emitting optical fiber in freezer cases in grocery stores. Hewlett-Packard spinoff Lumileds is also producing LEDs for a variety of applications.

LED technology is improving as well. UCSB has created an experimental LED that can put out 117 lumens per watt, while a Japanese company has developed one that can put out 130 lumens per watt.

Getting LEDs to produce white light that is tolerable to humans has also greatly improved. Manufacturers can do it two ways. One is to package red, green and blue LEDs in a way that the combined light shines white to the human eye. The other way is to make blue LEDs and coat them with a phosphor--a luminescent substance commonly used on fluorescent lamps.

Friday, November 03, 2006

We All Have Tough Years, Folks!

Here's an article from Morningstar.com on Bill Miller that arrived in my inbox today. For those folks out there who are invested in Bill's funds, here's a little advice for you:

Don't do anything foolish with your money. Bill is a much better investor than you will ever be.

sw

Bill Miller's Streak Might End. Horrors!
By Greg Carlson | 10-31-06 | 06:00 AM
Morningstar.com

After beating the S&P 500 Index for a remarkable 15 consecutive years, Bill Miller of Legg Mason Value LMVTX is lagging the benchmark by nearly 11 percentage points for the year to date through Oct. 26, 2006. The extent to which the fund trails the S&P--which in turn has taken a toll on its longer-term comparisons with the benchmark--has some investors spooked. For example, after my colleague Russ Kinnel wrote a recent column that placed Miller among the 10 best current mutual fund managers, he received an e-mail from a reader who wrote that Miller, due to his bout of severe underperformance, is no longer among the industry's elite.

We beg to differ, of course. True, it does look increasingly likely that Miller's streak of beating the S&P will end this year, for although the fund has had strong fourth-quarter returns at times (due to holdings that thrive during the holidays, such as Amazon.com AMZN), the current gap will be difficult to overcome in just two months. But we don't think the end of the streak would diminish the fund's attractiveness in the least.

In a way, it might be something of a positive if it helps investors better understand the fund's volatile nature (which the streak has masked). Miller runs a concentrated portfolio, takes big stakes in both racy fare such as Google GOOG and turnaround situations such as Eastman Kodak EK, and is willing to hang on to his picks through sharp downturns, so the fund's returns have long been among the most turbulent in the large-blend category. Earlier this year, Miller himself called his winning streak an "accident of the calendar," and it's easy to see why. Since the streak began in 1991, the fund has lagged the S&P in 47 of 178 rolling 12-month periods--more than 26% of the time (and by double-digit percentage points at times in the past). Furthermore, prior to the fund's recent struggles--and despite its consistency during calendar years--it trailed the index over several three-year rolling periods.
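
For readers who want to check that kind of statistic against their own fund data, here is a sketch of how rolling 12-month comparisons are computed. The return series below are random placeholders, not actual Legg Mason Value or S&P 500 data:

import random

random.seed(0)
months = 190                                   # roughly 1991 through 2006
fund  = [random.gauss(0.011, 0.05) for _ in range(months)]   # placeholder monthly returns
index = [random.gauss(0.009, 0.04) for _ in range(months)]

def trailing_12m(returns, end):
    """Cumulative return over the 12 months ending at index `end` (inclusive)."""
    total = 1.0
    for r in returns[end - 11:end + 1]:
        total *= 1 + r
    return total - 1

lagging = sum(1 for end in range(11, months)
              if trailing_12m(fund, end) < trailing_12m(index, end))
periods = months - 11
print(f"Lagged the index in {lagging} of {periods} rolling 12-month periods "
      f"({100 * lagging / periods:.0f}%)")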

Putting The Streak In Perspective
The potential end of Miller's streak brings up a larger point. Even the very best managers tend to underperform, often for extended stretches. For example, two large-blend funds have outpaced or matched Legg Mason Value's return since the start of 1991 with the same lead manager at the helm: Longleaf Partners LLPFX and Vanguard Primecap VPMCX. The former lagged the S&P 500 in seven of the past 15 calendar years, including five in a row in the late 1990s' bull market. And Vanguard Primecap trailed the index in five of those 15 years. The cause of this underperformance is clear. In order to beat the benchmark decisively, a manager has to be willing to build a portfolio that looks significantly different than the index and stick to his or her guns when that style is out of favor. (It's no coincidence that all three funds have had consistently low portfolio turnover.) The Longleaf fund typically owns just 20 stocks and will let cash build when appealing ideas are scarce, while the Primecap team tends to hold big stakes in its favorite sectors, particularly tech.

For his part, Miller certainly isn't shy about deviating from the benchmark. Of Legg Mason Value's 44 holdings, 36 are constituents of the S&P, but they comprise less than a fifth of that capitalization-weighted index. Five of the fund's top 10 holdings--Sprint Nextel S, utility concern AES AES, telecom provider Qwest Communications International Q, Sears Holdings SHLD, and Amazon--consume nearly a fourth of the fund's assets but less than 1% of the index. Furthermore, the fund has no holdings within the surging energy sector, a big reason why it merely squeaked by the index in 2004 and 2005. As a result of all these characteristics, the fund's returns are less correlated with those of the index than the vast majority of its category peers'.

Looking Ahead
Given the fund's atypical look and the generally uneven return profile of the best-performing funds, this fund could very well underperform the index beyond 2006; Miller's picks can sometimes take years to work out. But although his approach should normally lead to significant dry spells, it lends this fund tremendous appeal. Miller and his analyst team meticulously research each firm before purchase, ignoring most short-term metrics and focusing on long-term value. Miller is a fearless contrarian; if one of the fund's holdings gets hammered but the fundamental case for it hasn't changed much (if at all), he'll eagerly scoop up more shares. That tack has often resulted in bursts of superb performance. For all the hand-wringing about the fund's wide performance gap versus the index this year, it's worth noting that the fund beat the S&P by an even larger margin in 1996, 1998, and 2003. Ultimately, the best thing the fund's shareholders can do is ignore its short-term gyrations. They're better off worrying about how they're going to eat or dispose of all that leftover Halloween candy.

Nantero on the Move

Looks like the researchers at Nantero Inc. are on to something potentially very big.

************************

Nantero announces routine use of nanotubes in production CMOS fabs

Nov. 3, 2006 -- Nantero Inc., a Woburn, Mass., company using carbon nanotubes for the development of next-generation semiconductor devices, announced it has resolved the major obstacles that had been preventing carbon nanotubes from being used in mass production in semiconductor fabs.

Nanotubes are widely acknowledged to hold great promise for the future of semiconductors, but most experts had predicted it would take a decade or two before they would become a viable material. This was due to several historic obstacles that prevented their use, including a previous inability to position them reliably across entire silicon wafers and contamination previously mixed with the nanotubes that made the nanotube material incompatible with semiconductor fabs.

Nantero announced it has developed a method for positioning carbon nanotubes reliably on a large scale by treating them as a fabric which can be deposited using methods such as spincoating, and then patterned using lithography and etching. The company said it has been issued patents on all the steps in the process, as well as on the article of the carbon nanotube fabric itself, US Patent No. 6,706,402, "Nanotube Films and Articles," by the U.S. Patent and Trademark Office.

The patent relates to the article of a carbon nanotube film comprised of a conductive fabric of carbon nanotubes deposited on a surface. Nantero has also developed a method for purifying carbon nanotubes to the standards required for use in a production semiconductor fab, which means consistently containing less than 25 parts per billion of any metal contamination.

With these innovations, Nantero has become the first company in the world to introduce and use carbon nanotubes in mass production semiconductor fabs.

The company is developing NRAM -- a high-density nonvolatile random access memory device intended for use as a universal memory. The company says it can be manufactured both as standalone devices and as embedded memory in application-specific devices such as ASICs and microcontrollers.

Wednesday, November 01, 2006

The Secret to Investing Success

I came across this passage in an article by business strategist Nikos Mourkogiannis and had to laugh.

"Warren Buffett wanted to be an excellent investor - which meant being a rational investor. He knew that the best way to achieve this was by staying as far away as possible from Wall Street."

Indeed!

Here's the link to Nikos' article:


http://www.strategy-business.com/press/enewsarticle/enews102606?pg=0