
Earth Day and Computing

Growing up I was always interested in technology. Taking apart calculators was a favorite pastime, and naturally as I got older I migrated toward a STEM career. In college I studied to be a civil engineer, a field that is constantly becoming more difficult to define. While my primary focus in school was on structural engineering, the program was gradually shifting toward an emphasis on environmental studies. I enrolled in a few environment-related classes during my senior year and discovered, perhaps not surprisingly, that I was a fairly passionate environmentalist.

Since my graduation, the topic of global climate change has been researched and re-researched to the point where there is practically no dispute among the overwhelming majority of the world’s climate scientists, yet I still routinely find myself defending the science in conversations with my peers during car trips and lunch meetings. Climate change occupies a strange position: it is about as settled as science gets, and at the same time it is met with skepticism by a large chunk of the population. I find it all the more disheartening that the people I argue with are people I consider “scientific,” in that they too have chosen STEM career paths. Luckily, the very nature of technological advancement has my back.

When it comes to increasing computing power and providing a better experience for users, efficiency is the name of the game. Whether that means something as simple as a UI redesign or something as complex as a die shrink, increasing the efficiency with which tasks can be completed is the driving force behind technological advancement. Moore’s Law may be the quintessential illustration of this. From Wikipedia:

“Moore’s Law” is the observation that, over the history of computing hardware, the number of transistors in a dense integrated circuit has doubled approximately every two years.

It’s hard to say that Moore’s Law is the reason the power we have at our fingertips has increased so greatly over the years. It started off, back in the 1960s, as an observation that the number of components (transistors) on chips was in fact doubling every two years. Since then Moore’s Law has become more of a rallying cry or goal for Intel, the company Gordon Moore co-founded all those years ago, and companies like it. The modern incarnation of Moore’s Law you might be most familiar with is Intel’s “Tick-Tock” model, wherein the company either shrinks the process technology for its CPU dies – a “tick” – or redesigns its microarchitecture – a “tock” – roughly once every year to 18 months.
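
To get a feel for what that doubling rate implies, here is a small back-of-the-envelope sketch; the starting transistor count and the twenty-year span are arbitrary numbers picked purely for illustration:

```python
# Back-of-the-envelope Moore's Law projection: the count doubles every two years.
# The starting count and time span are arbitrary, chosen only for illustration.
def transistors(start_count, years, doubling_period=2):
    return start_count * 2 ** (years / doubling_period)

start = 1_000_000  # a hypothetical one-million-transistor chip
for year in range(0, 21, 4):
    print(f"Year {year:2d}: ~{transistors(start, year):,.0f} transistors")
```

Twenty years of doubling turns one million transistors into roughly a billion, which is why the observation has held up as a handy shorthand for exponential growth.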

Shrinking the CPU die for a tick is a big part of why virtually every aspect of computing and related fields has become far more energy efficient and powerful over the past decade. Industry leaders invest a massive amount of effort in constantly updating the equipment used to manufacture the silicon that goes into everything from CPUs to solar panels. This involves retrofitting enormous fabrication facilities (fabs) with new lithographic machinery that can “print” circuits at a smaller scale, fitting more components into the same amount of space, or fitting the same number of components into a significantly smaller space. It is an incredibly difficult process that involves introducing new techniques and materials to limit what is called “leakage”: current that seeps through transistors even when they are switched off, wasting energy as heat. The end result, however, is chips that can run faster on less energy.
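
To see why printing at a smaller scale packs more components into the same space, note that a transistor’s footprint shrinks roughly with the square of the feature size. Here is a minimal sketch assuming idealized scaling; the node sizes are illustrative, and real processes do not scale this cleanly:

```python
# Rough area scaling for a die shrink: a transistor's footprint scales
# roughly with the square of the feature size. Node values are illustrative only.
old_node_nm = 32
new_node_nm = 22

area_ratio = (new_node_nm / old_node_nm) ** 2   # ~0.47
print(f"Relative footprint after the shrink: {area_ratio:.2f}x")
print(f"Approximate density gain in the same area: {1 / area_ratio:.1f}x")
```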

Setting aside Moore’s Law, companies like Intel have dual motives for pushing this advancement: printing more full-sized CPU dies on each silicon wafer lowers their production costs, and, alternatively, fitting more transistors into the same amount of space lets them add features and performance to their products. The advancements made by Intel and other industry leaders trickle down to all chipmakers over time, and overall efficiency increases.
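
The cost side of that argument is simple arithmetic: a wafer costs roughly the same to process regardless of what is printed on it, so smaller dies mean more dies per wafer and a lower cost per die. A simplified sketch with made-up round numbers, ignoring edge losses and yield:

```python
import math

# Simplified dies-per-wafer estimate: usable wafer area divided by die area.
# All figures are hypothetical round numbers; a real estimate would account
# for edge dies, scribe lines, and defect yield.
wafer_diameter_mm = 300
wafer_cost_dollars = 5000  # hypothetical cost of one processed wafer

wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2

for die_area_mm2 in (160, 110):  # before and after a hypothetical shrink
    dies = int(wafer_area // die_area_mm2)
    print(f"{die_area_mm2} mm^2 die: ~{dies} dies per wafer, "
          f"~${wafer_cost_dollars / dies:.2f} per die")
```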

To the average consumer, this might only mean thinner smartphones with faster processors, better battery life, and higher-resolution displays. To gamers this means more frames per second without having to pay as much as before. To people like me, who are always keeping an eye on technological advancements and their effects on the world in a broad sense, it means less energy consumption, less waste, increased efficiencies across the board, and a healthier Earth.

Consider the energy industry, where there is an ongoing movement toward “greener,” more sustainable forms of energy production. While advancements in the computing space apply at a tangential level to virtually all forms of energy production (updated distribution systems and infrastructure, for instance), there are also some fairly direct applications. One example is the IBM solar collector that recently garnered attention from major industry news sources. The underlying technology has actually been around for quite a long time, but the increased efficiency of microprocessors, and their smaller size in this particular application, has led to significant gains in the amount of energy these concentrators can generate, making them newly relevant.

The advancement of computing technology has had a far-reaching positive impact on the environment. At CYBERPOWERPC, we do our very small part by offering the latest technology in our systems as soon as it becomes available. 4th Gen Intel processors, NVIDIA GeForce GTX 900 series graphics cards, SSDs, and some of the best 80 Plus certified power supplies help minimize the power consumption of high-end gaming systems. Please consider incorporating energy efficient components into your next build, whether you choose to build it yourself or buy from us. For Earth!
