Fri, 1st Aug 2008

Computer chip maker Intel celebrated its 40th birthday on July 18th. New Zealand-born Graham Tucker has worked for Intel for 20 years, so is in a unique position to comment on the technological revolution over that time. He also had some insights into the future of computing.

Why have you stayed with Intel for so long?
Intel is at the heart of the industry. For a technologist, it is exciting to be involved in cutting-edge technologies and industry standards. Also, IT has plenty of variety, not only in terms of product, but the industry has evolved many times – it’s like changing jobs constantly.

What projects are taking up most of your time at the moment?
Firstly, there is a rapid transition to mobility in both Australia and New Zealand. There’s demand for versatile, high-performance computing devices that can connect securely at any time over multiple communication protocols while still maintaining battery life. This is challenging from a design and implementation standpoint. Secondly, there is the push for energy-efficient computing. Governments and large companies know how important it is to reduce their carbon footprint. It’s all about performance per watt. Intel is playing a big part in improving the energy efficiency of computers by making better transistors and using them more smartly. The transition to mobility is also assisting the overall efficiency of computing.

Is it the “people” side of the industry that excites you the most?
Yes, I find IT people interesting and enjoyable. The industry changes faster than probably any other, but the people don’t. People in all areas of the industry (engineering, sales, journalists, etc.) may move into different areas, but most seem to be hooked on technology. Working with such a variety of interesting characters for so long has been one of the highlights.

How does the computer knowledge base compare to when you started at Intel?
In the early days, it was possible for a single person to have a very deep understanding of the complete solution - from the microprocessor, through the system design, all the way through to how the application worked. These people were called geeks. This is no longer possible. There are now so many specialist areas, and each subject can be very complex. Listening is the correct approach: I think you can gain knowledge by being humble and absorbing some of the expertise possessed by each individual.

Were you always interested in technology?
At a very young age, I had an interest in electronics - soldering bulky transistors into various configurations. The development of the microprocessor was of great interest to me. Moore’s Law - the transistor count on chips doubling every 18 months - has held true for a very long time and will continue to do so.
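
As a rough back-of-the-envelope illustration of that doubling, here is a minimal sketch assuming only the 18-month figure quoted above; the numbers are illustrative, not Intel projections:

    # Illustration of the doubling described above: transistor count
    # doubling roughly every 18 months.
    def growth_factor(years, doubling_months=18):
        """How many times the transistor count multiplies over a span of years."""
        doublings = (years * 12) / doubling_months
        return 2 ** doublings

    print(round(growth_factor(10)))  # ~102x over a decade
    print(round(growth_factor(20)))  # ~10,321x over two decades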

You were born a Kiwi, so what made you move to Australia?
Years ago Robert Muldoon made an interesting comment about Kiwis moving to Australia. He said it raised the IQ of both countries! Before I joined Intel 20 years ago, the Australian operation was managed by Americans who would cycle through every couple of years. I used to interact with these Americans from New Zealand, and eventually they decided to employ “locals”.

What’s the most exciting project you’ve ever worked on at Intel?
There have been many projects that were very exciting at the time, but were soon superseded by better technologies. When the 386 microprocessor was introduced in 1989 at 25MHz, the industry could not understand how anyone could ever fully utilise the architecture. When you look at technologies coming shortly from Intel like Nehalem [the follow-up to Intel’s Core Duo series] and Larrabee [a graphics processing chip], there is a similar degree of excitement.

How do you explain complex concepts simply?
The way technology is explained often improves when many people get involved to shape the message and get creative in explaining complex principles. For example, microprocessor architectures can be described with animations and analogies that really stick in people’s minds.

What stands out about the early days of the Net?
Before the public embraced the Internet, it was a tool that few people had access to and was therefore limited in application. Modems were used for dial-up bulletin boards for many years, and few computers had Ethernet controllers. The architects of the TCP/IP protocol - the underlying protocol of the Internet - had amazing foresight, as its ability to scale has shown.

How much smaller and more powerful can microchips be made?
What Intel does is very difficult. You can fit one thousand 45nm (nanometre) transistors across the diameter of a human hair. The breakthrough our engineers had with the development of the 45nm process was very significant. For 20 years we were using silicon dioxide for the gate dielectric. By changing this to hafnium and introducing new lithography techniques, we are now making good progress towards 32nm and 22nm sizes. We think Moore’s Law will hold true for some time to come.
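
As a rough sanity check on that hair-width figure, here is a sketch assuming a fine human hair of about 50 micrometres in diameter and treating 45nm as the feature width; both assumptions are ours, not from the interview:

    # Sanity check: how many 45nm-wide features fit across a ~50 micrometre hair?
    HAIR_DIAMETER_NM = 50_000   # 50 micrometres, expressed in nanometres (assumed)
    FEATURE_WIDTH_NM = 45

    print(HAIR_DIAMETER_NM // FEATURE_WIDTH_NM)  # 1111 - on the order of one thousand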

What does the future hold for computers?
Hardware is well ahead of software. We have repeatedly seen this pattern in the past. When there is enough computing hardware out there with this extended capability, software eventually goes through a revolutionary change.

Today computers are very process-orientated. By that I mean we use them to do very repetitive tasks reliably. For many years we were squeezing relatively small improvements in performance out of each iteration of single-core devices. In moving to dual- and multi-core microchips, the capability is growing very quickly. Last year our research engineers demonstrated an 80-core device. Although it is not in production, the potential is enormous. The potential is for computers to start thinking for themselves; to start making informed decisions as humans do.

One simple application of this is data mining. Modern people are receiving massive amounts of information - more than we can organise. If you have a digital camera you’re probably accumulating thousands of photographs – many more than people were gathering in the days of film. Imagine a search engine that uses facial recognition on all your images based on a name you supply. Information overload is a problem we need to solve quickly. Visualisation is another area where the industry could develop in leaps and bounds over the next decade.
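
To make that photo-search idea concrete, here is a minimal sketch of a name-based search over a folder of images, using the open-source face_recognition Python library (which post-dates this article); the file names, folder and tolerance value are illustrative assumptions, not anything Intel described:

    # Sketch: given one reference photo of a person, list which photos in a
    # folder appear to contain that same person.
    from pathlib import Path
    import face_recognition

    def find_photos_of(reference_photo, photo_dir, tolerance=0.6):
        reference = face_recognition.load_image_file(reference_photo)
        reference_encoding = face_recognition.face_encodings(reference)[0]

        matches = []
        for photo in sorted(Path(photo_dir).glob("*.jpg")):
            image = face_recognition.load_image_file(str(photo))
            for encoding in face_recognition.face_encodings(image):
                if face_recognition.compare_faces([reference_encoding], encoding,
                                                  tolerance=tolerance)[0]:
                    matches.append(photo.name)
                    break
        return matches

    # Hypothetical usage:
    # print(find_photos_of("graham.jpg", "my_photos"))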