
The Future Of Intel Processors - 2009 May Soon Seem The Stone Age

Updated on November 28, 2009

Completely new usage models and newly developed devices are accelerating the growth of traffic on the Internet. Various indicators suggest that the number of Internet-connected handheld devices may grow by a factor of eight in the next five years. The glue that will hold this entire infrastructure of handheld devices together is a new generation of servers acting as its backbone.

All of these uses and devices must be supported by that next generation of servers. These servers support not only the application infrastructure but also the software distribution model and the revenue service model. In the coming decade, businesses will want to leverage the vast amounts of data they have collected and dovetail that information with their customer interactions in order to develop new ways of doing business and to accelerate and improve customer support.

The strongest challenge facing businesses is how to accomplish this in a scalable and affordable fashion. Data is proliferating at such a rate that many businesses are finding that scaling up their proprietary in-house data centers to keep pace with the exploding volume is simply unaffordable.

Intel has been working on two essential technologies that are fundamental to making that scaling practical and affordable for a significant number of businesses. The first of these fundamental technologies is the cloud; the other is virtualization. The cloud is the simplest and clearest way to expand the market potential of data products. Providers of computing services are enamoured of this approach because it allows them to deliver services and applications throughout their networks in ways that were not possible before. Large multinationals also appreciate this potential, as it allows them to derive value from IT investments they have already made in their own proprietary cloud computing infrastructures.

Virtualization is equally important: by enabling a rich set of configurations and speeding up refresh cycles in data centers, it is a major factor in coping with the exploding data volumes expected over the next decade. The Nehalem microprocessor family was designed essentially for the server arena. Intel has implemented a common architecture that spans its product line, from extremely low-power embedded server parts consuming less than thirty watts all the way up to the dual- and multi-processor Nehalem EP and EX families, which can tackle the most complex computing demands of today.
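As a concrete illustration of the virtualization support discussed above: Intel's Nehalem-era processors expose hardware-assisted virtualization (VT-x) to the operating system, and on Linux this surfaces as the "vmx" CPU flag. A minimal sketch, assuming a Linux system with a /proc/cpuinfo file (the parsing helper here is illustrative, not part of any Intel tool):

```python
def has_vmx(cpuinfo_text: str) -> bool:
    """Return True if any 'flags' line in /proc/cpuinfo-style text
    lists the 'vmx' flag, which indicates Intel VT-x support."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            # A flags line looks like: "flags : fpu vme ... vmx ..."
            _, _, flags = line.partition(":")
            if "vmx" in flags.split():
                return True
    return False

if __name__ == "__main__":
    try:
        with open("/proc/cpuinfo") as f:
            print("VT-x (vmx) present:", has_vmx(f.read()))
    except FileNotFoundError:
        print("/proc/cpuinfo not available on this system")
```

A hypervisor (or a tool such as a VM manager) performs essentially this check before attempting to use hardware virtualization; without the flag it must fall back to slower, software-only techniques.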

The hot topic of the past few years has been the convergence of personal computers, mobile devices and non-interactive entertainment such as television. That convergence, however, has already happened... it has come and gone. The future is not about convergence but about interactivity: mobile and fixed devices for communications and computing allow engineers to create the continuum Intel touts, in which the whole variety of these devices inter-operates seamlessly.

There is no doubt that the next decade will bring heretofore unimagined technological developments and we may look back on these days of 2009 as the Stone Age.
