Regardless of which hardware platform you use in your data centre, whether you subscribe to software applications delivered as a service, or whether your core IT processing is done in New York or Bangalore, there's one thing you can't do without: electricity. It is perhaps the thing IT people think about least on a daily basis, yet it is without doubt the single most important requirement for computing. IT may, however, need to pay rather more attention to the workings of the electricity grid in future, if proposed changes such as the smart grid come to pass.
Ironically, the electricity infrastructure we rely on to power all this IT smartness is itself fairly 'dumb': it relies on electro-mechanical devices for metering, and still uses an almost 120-year-old, highly centralised 'push' model for generating and distributing high-voltage power.
Meanwhile, the fuel source for the bulk of the world’s (and the EU’s) electricity generation is a form of fossil fuel, the major disadvantage of which is the emission of CO2, SO2, NOx and a range of other pollutants when burnt. The resulting greenhouse gases (GHG) contribute to climate change, quite possibly the greatest environmental and economic challenge facing humanity.
Across the EU-27 countries, public electricity generation accounts for approximately one third of all CO2 emissions, making the sector the largest single source of the pollutant [i]. For this reason, and because much of Europe's existing generation capacity is nearing the end of its lifespan, attention has turned to renewable energy sources such as wind, tidal and solar power, as well as low-carbon alternatives such as nuclear.
Such generation methods aren't without problems. With the exception of large wind farms (perhaps located offshore) and hydro-powered plants, renewable generation tends to have much smaller output capacity than coal, gas and nuclear plants. The other issue is 'intermittency': the inability of a single power plant to deliver electricity continuously.
To state the obvious, wind farms need wind and photovoltaics (PV) need sun. But while a single wind farm may be experiencing a still day, the wind is always blowing somewhere, so the overall intermittency of highly distributed renewable generation is considered low enough for reliable bulk supply. A shift in energy sources therefore also implies a shift from highly centralised to largely decentralised power generation, including generation capacity co-located with the consuming facility. A wind farm out the back of the data centre, if you will.
Ignorance is bliss
Electricity consumers, meanwhile, are given very little detail about their energy consumption. The meter and the associated bill tell them accurately how much electricity they used in total, but nothing about which devices consumed how much.
Contrast that with the minute level of detail we are used to getting from network, security and performance tuning tools in the IT world, and you begin to appreciate the degree of blindness we accept with regard to accurately measuring and managing our electricity consumption. This is already a real problem for facilities managers and data centre operators as they strive for efficient use of equipment, especially in the current economic climate.
Identifying potential efficiency gains is hampered by the lack of granularity in electricity consumption data. The problem will only get worse as companies come under legislation such as the UK's Carbon Reduction Commitment, which requires them to pay for emissions from 2010 and so will drive further attention to why and where power is consumed [ii].
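To make the granularity gap concrete, here is a minimal sketch contrasting what a conventional meter reports with what device-level metering would add. The device names and figures are hypothetical, purely for illustration:

```python
# Hypothetical per-device readings (kWh over a billing period) that
# today's single facility meter collapses into one undifferentiated total.
readings = {
    "server_rack_a": 3200.0,
    "server_rack_b": 2900.0,
    "crac_unit": 4100.0,   # computer-room air conditioning
    "lighting": 600.0,
}

# All a conventional meter (and the resulting bill) can report:
meter_total = sum(readings.values())

# What granular, device-level data would add: visibility of where
# the power actually goes, and hence where efficiency gains lie.
top_consumer = max(readings, key=readings.get)
```

With only `meter_total` in hand, an operator cannot see that cooling (the hypothetical "crac_unit" above) dominates consumption, which is exactly the blindness the article describes.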
The answer to both problems, delivering reliable renewable-sourced electricity and the paucity of granular consumption data, lies in IT-enabled electricity 'smart grids'. Smart grids have been studied extensively by electricity generation experts, with the EU supporting further investigation through its Sixth Framework Programme (FP6) round of innovation funding [iii]. The general consensus is that smart grids are the way forward, and given the impending end-of-life of existing power stations, as well as efforts to drastically cut emissions in the light of climate change, we are likely to see them introduced sooner rather than later.
The introduction of smart grids is something IT professionals need to keep an eye on. For a start, what makes a grid 'smart' is embedded IT, and generally a lot of it.
For example, next-generation electricity meters won't just measure usage; they will play an active role in managing consumption at facility and device level.
Such management will enable distributed 'micro-generation' sources to coordinate with one another, so that the intermittency of any single source is compensated for by the latent capacity of another. A facility may have an on-site wind turbine as well as a grid connection, with smart meters controlling whether the energy generated is consumed locally, exported to the grid, or whether supply needs to be drawn from the grid.
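The local-consume / export / import decision just described can be sketched as a simple balancing rule. The function name, units and return shape below are illustrative assumptions, not part of any real metering protocol:

```python
def dispatch(local_generation_kw: float, local_demand_kw: float) -> dict:
    """Sketch of how a smart meter might balance an on-site turbine
    against the grid connection (hypothetical names and units)."""
    if local_generation_kw >= local_demand_kw:
        # Surplus: meet demand locally, export the remainder to the grid.
        return {
            "consumed_locally_kw": local_demand_kw,
            "exported_kw": local_generation_kw - local_demand_kw,
            "imported_kw": 0.0,
        }
    # Shortfall: consume everything generated, import the difference.
    return {
        "consumed_locally_kw": local_generation_kw,
        "exported_kw": 0.0,
        "imported_kw": local_demand_kw - local_generation_kw,
    }
```

For instance, a turbine producing 120 kW against an 80 kW facility load would export the 40 kW surplus, while on a still day the same facility would import its shortfall from the grid.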
Individual devices that consume power may also be asked to throttle back by entering a lower power-consumption state, trading performance against power needs. Such capabilities will only work if the embedded smarts in the grid can communicate with each other over a LAN/WAN (at facility level) and across the internet (enabling cooperation between micro-generation facilities).
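One way to picture the performance-versus-power trade-off is as a policy that picks the highest power state fitting within a grid-requested target. The states, their draw fractions and the function below are all assumptions for illustration; real demand-response protocols are considerably richer:

```python
from enum import Enum

class PowerState(Enum):
    """Hypothetical device power states, as a fraction of rated draw."""
    FULL = 1.0
    REDUCED = 0.6
    IDLE = 0.2

def throttle_request(rated_draw_kw: float, target_kw: float) -> PowerState:
    """Pick the highest power state whose draw fits within the
    grid-requested target (illustrative policy, not a real protocol)."""
    for state in PowerState:  # iterates in definition order: FULL, REDUCED, IDLE
        if rated_draw_kw * state.value <= target_kw:
            return state
    return PowerState.IDLE  # cannot meet the target; fall back to lowest state
```

A 10 kW device asked to stay under 7 kW would drop to the reduced state (6 kW), accepting degraded performance in exchange for meeting the grid's request.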
Clearly, whether a facility or device is able to throttle back or power down will be controlled by policy, and will become an aspect of service contracts between utilities and consumers. Policy is all well and good, but we must also be mindful that our future electricity grids will include highly functional, network-addressable IT devices capable of managing device and facility consumption. The potential for nefarious activity, whether motivated by fraud or mischief, is well recognised even by the most ardent smart grid proponents [iv].
It is important, as always, to avoid hyping any potential security risk. That said, information security professionals need to be aware that the workings of the most basic IT resource, the electricity supply, are changing in a manner that introduces a far larger, remotely addressable attack surface, married with tempting opportunities for mischief and monetary gain. CSOs need to be working with their facilities management colleagues to manage the risk, as efforts to reduce GHG emissions change the nature of our electricity grids.