The Reputable Salesperson Versus the Tech Advisor

Vendors and consulting companies selling systems or projects are there to win the sale.  Reputable salespeople do adhere to ethical standards.  They won’t sell you an $80,000 system when they know that one for $15,000 will do the job.  They will present options and not try to force customers into choices they really don’t want.  They don’t hide information about long-term costs and maintenance.  They know and believe in their product lines, and if a question comes up that they can’t answer, they find out.  To some extent, they are there to assist and support.  All these things are true, of course, of only the best and most reputable salespeople.

However, it is neither the job nor the duty of the salesperson to delve deeply into an organization’s requirements and interests, to look at alternative solutions, and to make the optimal human connections around the system.  They are not obligated to represent their competition or to have a thorough cognizance of the industry.  Although they may explain why their solutions are better than the competition’s, they probably won’t highlight all the unique advantages in those alternatives.  They may not even know what those are.

Salespeople, no matter how reputable, are not being paid to do the due diligence for the organization.  They are obliged to ensure a fit to the best of their ability, but they may not have the knowledge to assess the suitability of a product or system.  Sometimes organizations make decisions that bind their vendor searches.  Perhaps they have decided on SAP or Oracle in advance, and they then search for a good vendor, who will happily oblige.  However, have they really determined that SAP or Oracle will meet their needs in a cost-effective way?  Have they carefully considered what other types of systems they might need in conjunction with what they are planning?  Generally vendors will not try to dissuade customers from buying their product lines.  Also, when an organization is desperate for a fix to a nagging problem, it may make rash decisions it later comes to regret.

Tech advisors maintain a vendor-neutral stance, an open-ended perspective, and an insistence on due diligence.  Therefore they can be the advocates, whereas vendors and solutions providers cannot.

Copyright © 2011-2012 Patrick D. Russell

Are You Really Getting Your Money’s Worth?

In software systems, as in many other areas of business and life, there is the old maxim, “You get what you pay for.”  This maxim wisely advises us not to “cheap out” – to try to get by with less than what we need to do the job.  It also warns us of the costs and consequences of making such a choice.

Getting by on the cheap may be a bad plan, but paying a premium price is no guarantee that a system will meet expectations.  My book, The Tech Advisor, gives numerous examples of paying too much for a system and failing to get the value expected.  In fact, the price you pay for roughly the same value may differ by orders of magnitude!  While the book provides many examples, I will highlight a few of the major ones in this blog.

What are some things to look for, when you want to identify what is really cost-effective?

Make sure your new software is really an upgrade.

All too often organizations spend a great deal of money (sometimes millions) on new software that really doesn’t end up working well.  To get an idea of this, consider how you sometimes feel as an end-user, when one of your favorite websites undergoes a facelift, and features you rely on are removed or made very difficult to access.  You know the company spent a great deal of time and money on it, but why didn’t they ask you what was important?

Why does this happen?  For a number of reasons – but here are three likely candidates:

  • The system was designed without adequate representation of all the affected constituents, both inside and outside the organization.  An example given in the book was some “upgraded” hotel check-in software that caused numerous problems for the desk clerks, resulting in long lines at the counters.  The needs of neither the desk clerks nor the guests/customers were adequately represented in the development of the system.  Managers, designers, and developers can be truly out of touch with the larger community.
  • The software was “sold” to the organization by a vendor or even an internal group, but the purchasers did not fully understand what they were getting — and the implications for everyday process and integration with other systems.
  • The bait-and-switch tactics of some vendors and consulting companies can also be a problem.  The “heavy hitters” show up only at the beginning of the engagement, but most of the system development is left in the hands of junior people who lack experience and knowhow.

Don’t buy what you already have.

Companies sometimes buy new and expensive systems, without realizing that they already own all or most of the capabilities they are seeking.  Operating systems and large packages often come bundled with all sorts of powerful tools.  The book gives the example of small firms buying expensive document management systems (which in some cases proved to be nothing more than a big headache) when they already owned Microsoft SharePoint as part of their Windows Server.

Today there are many open source products that are very capable and are free of charge.  Spend money on what you really need, not on what you don’t.  For businesses, the best software investments will usually involve focused attention on what is unique in their processes, rather than on some grand system.  If you need to make a cross-country trip, invest in a plane ticket before you start pricing deluxe motor homes.

Watch pricing in product lines and Software-as-a-Service.

The book covers several examples.  Price structures may make it likely you will incur upcharges more rapidly than you might think.  Perhaps you are a small company that buys the “small business” edition of a product, only to learn that there is a critical feature only available in the “enterprise” edition, which costs considerably more.

Another example was found in a Software-as-a-Service (SaaS) system.  The product provides two major services and comes with a low entry price and a seemingly generous amount of online disk space.  However, most of that disk space is allocated to one service that is likely to remain relatively small, whereas the other service, where very rapid expansion would be anticipated, is allotted less than 5% of the total disk space.  Additional disk space costs far more than the initial subscription, and the allocation can never be reduced, even if you archive off old data.
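To make the trap concrete, here is a small sketch with invented numbers (the entry price, the storage allotments, and the per-GB upcharge are all hypothetical, not the actual product's figures) showing how quickly the overage on the fast-growing service can dwarf the entry subscription:

```python
# Hypothetical SaaS pricing illustration -- all figures invented.
base_monthly = 30.0            # low entry subscription price
included_gb = 100.0            # total disk space bundled with the plan
fast_service_gb = 0.05 * included_gb   # only 5% goes to the fast-growing service
extra_gb_price = 2.0           # per-GB monthly upcharge, never reducible

def monthly_cost(needed_gb):
    """Monthly bill once the fast-growing service needs needed_gb of space."""
    overage = max(0.0, needed_gb - fast_service_gb)
    return base_monthly + overage * extra_gb_price

print(monthly_cost(5))    # 30.0  -- fits the allotment at sign-up
print(monthly_cost(50))   # 120.0 -- the overage alone is 3x the base price
```

The point is not these particular numbers, but that the cost curve is set by the smallest allotment in the bundle, not the headline total.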

Sometimes it is very easy to sign up for SaaS software, but the price structure may be buried deep in a website.  When it comes to such purchases, it is caveat emptor.

Software product lines and SaaS can offer excellent value, and SaaS promises tremendous boons.  However, you must do your due diligence and check prices carefully.  Project your needs forward, and compare them to the price structure.  There may be price changes and increases, but if the structure appears sensible, given your current and anticipated future needs, that is a positive sign.  Once your business makes a commitment, even if it is only to a SaaS system, it is neither trivial nor inexpensive to make a change.

There is no substitute for good technology planning.

Good technology planning will help you find the best value in software for your organization.  A tech advisor may be able to help if you don’t have enough expertise in-house.  Regardless, be aware that the software vendors will not necessarily tell you all you need to know.  The “wow” demonstrations and the glitzy marketing materials may be misleading.  You need to learn the rest of the story.

Copyright © 2011-2012 Patrick D. Russell

The Overlooked Power of Social Networking

What is Familiar …  

Social networking has been widely embraced in recent years, especially by the young.  Sites such as Facebook, Twitter, YouTube, Pinterest, and others have become major powerhouses of the internet, accompanied by many thousands of blogs and personal websites.  Sometimes social networking is used to create or connect communities of friends, families, like-minded individuals, clubs, and organizations.  It can be a vehicle for sharing information and education.  It also provides a very public forum for self-expression, which can range from the ordinary to the odd or extreme.  Thoughts and reactions that formerly were private are now out there for anyone to read.  Average people can easily publish pictures, music, and videos.  The new smart phones make posting on social media as easy as texting a friend.  Low-cost website hosting, which supports free and relatively easy-to-use blog and content-management software such as WordPress, has made it easy for individuals to create a web presence.  Somehow, through the magic of internet search engines, and people who spread the word in person or electronically, isolated posts may get discovered and “go viral,” without any formal efforts to publicize or advertise.  Of course the conventional media are watching and in turn will often publicize whatever is getting attention on the informal channels.

The Arab Spring showed the world how social media, texting, smart phones with built-in photo and video cameras, and worldwide telephony could be the communications backbone of a successful revolution.  One of the ways the besieged governments tried to hold onto power was to shut down the communications infrastructure.  Ultimately this worked against them, as it proved to the world the repressive nature of the regimes and fueled the disenchantment.  Meanwhile, the revolutionaries searched for and found cracks in the armor so they could get their communications out.

Despite the admitted power of social networking, there are many “old fashioned” people who are uncomfortable with it.  They look at typical posts and comments and see 99% trash talk and junk.  They don’t understand why anyone would prefer to text, typing on a Lilliputian keyboard, rather than making a phone call (except of course for texting under the table during a business meeting).  They may have heard that social media are essential for business networking, but then they dipped a toe into the waters and got limited or no results.  They may have heard stories about young people who posted raucous accounts of their college beer bashes, only to have those read by prospective employers, who may go so far as to demand their Facebook logins and passwords.  They may also be concerned about the predators who troll the internet for the innocent, the gullible, and the over-exposed.  These people see little positive value in social networking, and may dismiss it entirely or make a point of avoiding it.

These perspectives on social media are common knowledge for anyone following major trends.  However, there are other significant dynamics relating to social media that are not so obvious.  Here I would like to discuss one of those, and you may not have given it a lot of thought.

What You May Have Overlooked …

Despite all the positives and negatives, there is real power in social networking that many overlook.  The social media are great repositories of sentiment, and this sentiment can be mined and analyzed.  What’s more, because the social media are highly dynamic, changes in sentiment can be monitored with virtually no lag time.  We are seeing more and more evidence of this in the conventional media, where, for example, a number of television shows now report on the “pulse of the day” based on monitoring the social media.  As we know, government agencies are also watching, looking for “chatter” that might alert them to nefarious plots.

This is a matter of statistical significance rather than absolute accuracy.  Twitter includes a positive and negative sentiment detector in its programming interface.  Even if it is only 70% accurate it has analytical value.  For example, if a news story about a company or product surfaces, a volume spike in the chat (or lack thereof), and the direction of the sentiment, might give a company useful intelligence in terms of how it should respond.  A company can also use such studies to measure the effectiveness of advertising, and the interest in and reactions to various products and features.  Despite the margin of error, this information is available well before the sales results come in, and at very least provides another data point to correlate with those results.  Comparative studies among products, companies, political candidates, and popular entertainment are all possible.  There are indeed many practical uses for data mining in the social media.
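To see why even a rough classifier has analytical value, consider a simple noise model (a sketch of my own, not Twitter's actual algorithm): if a detector is correct with probability p, a true positive-sentiment share q shows up as a measured share m = q*p + (1-q)*(1-p), and that relation can be inverted to estimate q from what the noisy detector reports.

```python
# Why a 70%-accurate sentiment detector is still useful at volume:
# the noise dilutes the signal predictably, so it can be corrected for.
def estimated_true_share(measured_share, accuracy):
    """Invert m = q*p + (1-q)*(1-p) to recover the true positive share q."""
    return (measured_share - (1 - accuracy)) / (2 * accuracy - 1)

# Suppose 60% of posts about a product are truly positive.
p, q = 0.70, 0.60
measured = q * p + (1 - q) * (1 - p)   # what the noisy detector reports
print(round(measured, 2))                           # 0.54 -- direction still visible
print(round(estimated_true_share(measured, p), 2))  # 0.6  -- true share recovered
```

With millions of posts, the sampling error around the measured share becomes tiny, which is exactly the "statistical significance rather than absolute accuracy" point above.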

What might make this especially attractive to businesses is they don’t need to use clandestine means to get this information.  No hacking is required, just good software programming.  Social media are public to a large extent.  Yes, there are private groups, and endlessly overlapping Venn Diagrams of “linked” individuals and entities, which seem to illustrate the “six degrees of separation.”  However, people who want to get their feelings and reactions out, especially about public events, companies, products, and high profile individuals, want to express themselves in a public forum.  Most large companies and other entities have presences on social sites, where any user can comment, and these act as a kind of sentiment magnet.  Twitter also has mechanisms that enable users to flag Tweets as relevant to a specific entity.  The sheer volume of social media content in the public forum often guarantees statistical significance, even when the analytical tools have a large margin of error.

The only downside is that the social media represent a skewed demographic, favoring the young with time on their hands or the disenfranchised.  Yet they also reflect those who are technologically savvy, and who are willing buyers of technology such as smart phones and trendy automobiles.  Many companies target their product offerings to this group.  Moreover, as time goes on, the social media are gaining in popularity and are being adopted by a broader demographic.  Companies are reaching out on their Facebook Pages and YouTube Channels to wider audiences, even if the tone remains youthful.

These public data can be retrieved and analyzed using specialized software.  Not only that, depending on the particular social medium, data fields are available to refine the intelligence, such as language, internet/hosting source, type of phone or application used, and geographic coordinates.  All in all, there is real power here, and organizations are starting to exploit it.
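As a simple illustration (the post fields and values here are invented, not any network's actual schema), refining mined posts by such metadata amounts to straightforward filtering:

```python
# Illustrative post records with the kinds of metadata fields mentioned
# above: language, source/application, and geographic coordinates.
posts = [
    {"text": "Love it", "lang": "en", "source": "mobile", "geo": (32.7, -117.2)},
    {"text": "Bof",     "lang": "fr", "source": "web",    "geo": None},
    {"text": "Great!",  "lang": "en", "source": "mobile", "geo": None},
]

def refine(posts, lang=None, source=None):
    """Keep only posts matching the requested metadata filters.
    (A real pipeline would add geo, time window, etc. the same way.)"""
    return [p for p in posts
            if (lang is None or p["lang"] == lang)
            and (source is None or p["source"] == source)]

print(len(refine(posts, lang="en", source="mobile")))  # 2
```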

A new take on the Survey?

A highly respected colleague of mine specializes in the techniques for constructing effective surveys, which can serve as valid input to powerful predictive analytics.  The techniques are based on science, math, and statistics, as well as lessons from experience.  He has demonstrated the efficacy of this form of business intelligence as offering a substantial amplification to returns-on-investment for business initiatives.

He and I have been discussing the value of adding a social media data mining front end to this kind of business intelligence.  This would not replace formal surveys, which of course provide highly focused input – but rather would offer an additional input that has its own advantages.  First, social media provide “immediate” feedback.  Second, they don’t depend on volunteers willing to respond to a survey request.  Both approaches have benefits, and a future study of interest would be to compare differences in their dynamics, their relative strengths and weaknesses, and their potential synergies.

A key advantage of data mining in social media is that it can be automated – from the raw data collection all the way to a real-time dashboard of dials and graphs.  This can be done with relatively simplistic algorithms, and yet be informative.  If we take the automation further, artificial intelligence can be introduced and evolved so as to yield highly refined results.
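Here is a minimal sketch of one such simplistic algorithm: a rolling average over the most recent scored posts, the kind of computation that could sit behind a real-time dashboard dial.  The class name and window size are illustrative, not from any particular product.

```python
from collections import deque

class SentimentDial:
    """Rolling-average 'needle' for a real-time sentiment dashboard."""
    def __init__(self, window=100):
        self.scores = deque(maxlen=window)  # oldest scores drop off automatically

    def add(self, score):
        """Feed in one classifier score, conventionally in [-1.0, +1.0]."""
        self.scores.append(score)

    def reading(self):
        """Current needle position: mean of the scores still in the window."""
        return sum(self.scores) / len(self.scores) if self.scores else 0.0

dial = SentimentDial(window=3)
for s in [0.5, -0.5, 1.0, 1.0]:
    dial.add(s)
print(dial.reading())  # window now holds [-0.5, 1.0, 1.0] -> 0.5
```

Refinements such as volume spike detection or per-entity dials are layers on top of the same stream; the artificial intelligence mentioned above would replace the simple average with smarter scoring.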

Russell Kennedy Partners has developed software to gather comments and posts from Facebook and YouTube, and Tweets from Twitter, for re-use in advertising and market analysis.  We are currently building a more powerful analytics module, which includes additional sources and is designed for corporate clients in the entertainment, health, fitness, beauty, and professional services industries.

Copyright © 2011-2012  Patrick D. Russell

Bill Swanson and Predictive Analytics

I met Bill Swanson in October 2009, when he gave a presentation entitled Predictive Analytics at Confab — the annual international conference for the Institute of Management Consultants (IMC).  For me, his was the most exciting presentation at the conference.  Why?  Because it was something totally new (for me) and highly compelling.  What’s more, the conclusions of Predictive Analytics often go against “common sense” and the “obvious” — as Bill demonstrated in his presentation, using specialized audience polling devices.

Predictive Analytics is a new way of making strategic business decisions.  Bill is a Management Consultant who specializes in Business Strategy.  However, his niche is unique.  I have had the privilege of meeting many fine management consultants as a Chapter President in IMC (San Diego Chapter).  Yet Bill is the only one I know who specializes in Predictive Analytics.  When I met Bill, he had recently completed an engagement with a midsize company that had been in business for many years, but wanted to improve its performance and productivity.  After some initial high-level resistance, the company accepted Bill’s analysis, and went on to have its best years ever (during the Great Recession).

Predictive Analytics is relatively new, but is poised to transform the way strategic business planning is done.  It promises to elevate decision-making beyond speculation and guesswork.  It can be applied to both internal and external factors, but is especially useful in processing crucial data from outside the organization.  It can start with surveys expressly designed to retrieve the most useful data, within what is practical and acceptable, but it also factors in organizational essentials.  Mathematical analysis and advanced statistics, supported by specialized software, yield quantitative results predicting the outcomes of hypothetical strategies, and substantiate which ones would yield the best return on investment and under what circumstances.  The results are often surprising, and may differ markedly from conclusions based on a simpler analysis.  Nevertheless these findings repeatedly prove to be accurate, and are now challenging the proverbial “gut instincts” cherished by many executives.  The new analytic methodologies are being used more and more by forward-thinking organizations.

Kennedy Information Inc., a watchdog for the management consulting industry, held a conference for the IMC in Chicago in May 2010.  Tom Rodenhauser led the discussion.  He predicted that Analytics would change the face of strategy consulting, making some aspects of it a technology-empowered commodity.  At the May 2011 conference, Kennedy Information underscored its prediction, and indicated that Analytics was becoming more widely accepted, as business leaders in tough economic times were looking for more objective validation of big decisions.

Bill is both a valued colleague and a friend.  He presents to national and international management and strategy conferences several times a year.  I invite you to visit the excellent website for his consulting firm, CEO Decisions.

… so Bill concentrates on Strategy and Predictive Analytics, and my sphere is Software Technology and the Tech Advisor — but where do these two specialties meet?  There are a number of points of intersection.  The most obvious one involves the kinds of decisions faced by midsize and large organizations.  They are looking for how to deploy what may amount to many millions of dollars, and are seeking optimal results and strong ROIs.  Many such decisions will imply new, or changed, business processes, which in turn are likely to require new business/enterprise software systems, or big changes in existing ones.  The Tech Advisor is first and foremost about the planning stages of major systems.  The analytics drives the high level, strategic directions.  However, the Tech Advisor can flesh out the technical implications and alternatives relevant to the choices under consideration in the analytics.  This makes for a powerful give-and-take that brings the alternatives into laser focus — for better decisions and better technologies.

However, there are other key synergies between Predictive Analytics and Software Technology as well. Process automation, or workflow, is the source of organizational data which can feed into the analytics — if those data are properly organized and instrumented.  In addition, workflow data can be used for performance monitoring and fine tuning as a strategic initiative gets underway.  We will discuss these connections further in future blog posts.

Copyright © 2011 Patrick D. Russell


Green IT: Why it is so important.

Updated July 9, 2012

There has been a lot of talk in recent years about Green IT.  People often don’t understand it, or think of it in a piecemeal way – in terms of a few energy-saving products they may have heard about.  Many companies are jumping on the environmentalist bandwagon, hoping to capitalize on the Sustainability movement.  What is Green IT really all about, and is it really important, compared with other environmental initiatives?

Moore’s Law dates back to Intel co-founder Gordon E. Moore’s paper in 1965, where he observed a doubling of the componentry in integrated circuits every year since their invention in 1958, and predicted that the trend would continue.  “Moore’s Law” has since been restated as a doubling of chip performance every 18 months (according to Intel executive David House) or every 2 years — both of which have proven to be consistent approximations over many years.  Thus we have come to expect increasing efficiencies in computing.  The typical desktop PC of today uses about the same sized power supply as it did in 1995, indicating that power consumption has remained relatively flat as computing power has multiplied, due to continuing miniaturization.  Doesn’t this provide an opportunity for computer companies to “greenwash” normal and expected improvements in efficiency as energy-saving breakthroughs?  When the oil and coal companies are touting how “green” they are, you have to ask yourself, what does green really mean?

Green IT can also be a confusing concept because we already think of computer-based systems as paragons of hyper-speed and hyper-efficiency.  There is no simple formula for how much “work” computers can do in terms of manual, physical, or intellectual labor, but the gains that have been achieved are obvious.  What’s more, there are many secondary environmental gains.  More people can work at home, reducing auto emissions.  Business people don’t need to travel as often or as far.  Manufacturing, shipping, and transportation systems have been made faster and more energy-efficient thanks to computer-based logistics and automation.  The list goes on.  In light of all this, aren’t we just gilding the lily with the idea of Green IT?  Isn’t computer energy usage relatively inconsequential given all these benefits?

The Data Center Challenge

Yet Green IT is real – and it might be argued that it is of strategic importance in terms of worldwide energy policy and economic development.  By some reports, roughly 8% of all electric power use in the United States is for data centers.  Stated simply, the problem is that computers consume a lot of power (electricity), and they generate a lot of heat and must be cooled in order to function properly.  The electricity consumed for cooling actually comprises most of the energy costs for the average data center.  With the continuing rise of the internet, and the burgeoning demand for online shopping, Voice-over-IP, smart phones, video over IP, high speed graphics, and the growing usage of “the cloud,” the demand for data center capabilities is expected to rise rapidly.  What’s more, the peak demand periods that plague utility companies overlap times of heavy data usage.  Thus the greening of data centers can have a huge positive impact both on greenhouse gas emissions and on the secure operation of the power grid itself.

Recently our team worked with a midsize data center complex.  Out of the roughly 75,000 square feet of total floor space, the server room only took up 6,600 square feet.  The total electric bill for a recent year was about $535,000, and out of that approximately $375,000 was spent just on the server complex (running the hardware and cooling).  Thus more than two-thirds of this whopping bill went for less than a tenth of the floor space.  These numbers may convey to the average person the magnitude of the problem.  To make the issue even more convincing, I should point out that this data center was located in a temperate climate where it rarely reached 80 degrees F., was well managed and up-to-date in all its equipment, and had already deployed a number of state-of-the-art energy-saving strategies in software and computer hardware.
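A quick back-of-the-envelope check confirms those proportions, using the approximate figures quoted above:

```python
# Figures from the data center engagement described above (approximate).
total_sqft, server_sqft = 75_000, 6_600
total_bill, server_bill = 535_000, 375_000   # annual electricity, in dollars

floor_share = server_sqft / total_sqft   # share of floor space for servers
cost_share = server_bill / total_bill    # share of the electric bill

print(round(floor_share * 100, 1))  # 8.8  -- less than a tenth of the space
print(round(cost_share * 100, 1))   # 70.1 -- more than two-thirds of the bill
```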

Big Data

The impetus for Green IT will soon reach critical levels due to the phenomenon we now refer to as “big data.”  I have heard various figures for the rate at which data are increasing, and they are staggering – one statistic cites a tenfold increase every five years.  Data are generated at faster and faster rates, and storage capacities are increasing, though not fast enough.  A 2010 article in The Economist, Data, data everywhere, provides some fascinating examples, and outlines a number of the issues.  Don’t think of big data as a lot of old files sent off to a dusty storeroom somewhere.  Think of them more like Google does – something to be indexed, searched, analyzed, and generally mined for value.  The sheer vastness of big data makes all this Google-like processing far more of a challenge.  Consider the difference between looking for a needle in a 1 cubic meter haystack versus a 10 cubic meter haystack.  The implications for Green IT should be clear:  big data imply not only more information to store, but also faster and more dynamic processing of those data.  As data increase tenfold, it would be unacceptable if searches and retrievals ran 10 times slower.  The storage itself increases power use, but the exhaustive processing also makes heavy demands.  (The successful management of big data is a separate, but related topic.)

While I am not offering detailed projections, it should be clear that a 10x increase in data and data processing within a five-year period would simply swamp the power grid without increases in energy efficiency.  Moore’s Law-type improvements now can be seen not just as fortuitous advances, but as the only thing keeping our society from being eaten by the giant data monster.  Recently, however, various scientists have expressed concern that Moore’s Law may start to fail, as the current silicon-based technologies are reaching limits at the atomic level and due to the laws of thermodynamics.  (For example, see The Collapse of Moore’s Law: Physicist Says It’s Already Happening by Matt Peckham in Time Techland.)  What’s more, the degree of difficulty in designing and fabricating denser megachips is increasing exponentially, demanding more intensive capital flows.  Breakthrough technologies may come to the rescue, but the timeline for their arrival is uncertain.  If you take Moore’s Law as a doubling every 2 years (rather than every 18 months), as some experts have stated it, we are already running behind, as chip power will only increase by 8x in 6 years, whereas data will increase 10x in only 5 years.  (In ten years this would imply a 100x increase in data, versus an increase in processing power of only 32x.  Interestingly, if we use a constant 18-month doubling for Moore’s Law, instead of 2 years, we stay about even with our 10x in 5 year data expansion.)  Granted, these are rough guidelines.  Even so, they should give us pause.
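The arithmetic behind these rough guidelines is easy to verify:

```python
# Chip performance doubling every 18 or 24 months, versus data growing
# 10x every 5 years -- the comparison made in the paragraph above.
def chip_growth(years, doubling_months):
    """Multiplicative growth in chip performance over the given span."""
    return 2 ** (years * 12 / doubling_months)

def data_growth(years):
    """Multiplicative growth in data, at 10x every 5 years."""
    return 10 ** (years / 5)

print(round(chip_growth(6, 24)))    # 8    -- 8x chips in 6 years, vs ...
print(round(data_growth(5)))        # 10   -- ... 10x data in only 5
print(round(chip_growth(10, 24)))   # 32   -- vs 100x data in ten years
print(round(chip_growth(10, 18)))   # ~102 -- 18-month doubling keeps pace
```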

More than a Billion PCs

While data centers and the rise of big data are major challenges at the system and server level, Green IT has further challenges outside the data centers.  The number of personal computers in use in the world reached one billion at the end of 2008, with a prediction of two billion by 2015.  (See the article in Worldometers.)  This is a mix of desktops and laptops, and represents a substantial amount of electrical power usage (which includes a cooling load in office buildings in hot weather or climates).  If you walked into a big office floor space in the 1990s, it was very typical to see desktop computers and monitors running even when many of the people were elsewhere (not to mention printers and copiers).  Typically workers left their machines on 24/7 as well.  This world is changing.

The energy costs for workstations and personal computers are enormous, and often represent substantial expenses for businesses.  These costs aren’t even factored into our data center numbers, which means that the overall IT energy problem is far larger.  The energy costs for running a couple of computers in a small office may only be perceived as a minor issue to the small business owner, and cooling may not be a problem at all, as it is in a data center where so much processing power is concentrated.  In a large corporate office space, these issues do get attention as bottom-line items.  We should also factor in smart phones, tablets, and other devices.  Will the gradual shift from old-fashioned desktops to lower power devices like laptops, tablets, and phones, tend to reduce energy consumption?  … or will their very popularity, and the fact that many people are now using these devices to run intense games and applications and stream entire movies, significantly increase the load on data centers?

The overall picture is complex.  However, it certainly has gotten the attention of the utility companies in locales where they are struggling with peak demand issues, environmental mandates, and record heat waves.  Some utilities have been subsidizing businesses to replace older desktop PCs and servers with new models equipped with 80 PLUS certified power supplies.  These are at least 80% efficient, compared with the typical 60-70%, and reduce computer energy consumption by 15-25%.  The fact that the utilities are stepping up here is an indication that PC power consumption is an important target in their efforts to manage burgeoning power consumption.
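The 15-25% range follows directly from the efficiency ratings: for the same power delivered to the components, wall-socket draw scales inversely with power supply efficiency.  A quick check:

```python
# Fractional reduction in wall-socket power when upgrading PSU efficiency,
# holding the power delivered to the components constant.
def savings(old_eff, new_eff):
    """1 - (old wall draw ratio): wall power scales as 1/efficiency."""
    return 1 - old_eff / new_eff

print(round(savings(0.60, 0.80) * 100, 1))  # 25.0 -- upper end of the range
print(round(savings(0.70, 0.80) * 100, 1))  # 12.5 -- near the lower end
```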

A New Definition of Green IT

Given all this, we can really begin to see the importance of Green IT.  For purposes here, we can define Green IT as the technologies and approaches that give us energy savings, and reduced environmental damage, over and above those gains provided by Moore’s Law (the chipmakers).  We might say that the core business of the chipmakers (like Intel) is to make more powerful processors with relatively flat energy usage and also a relatively flat cost.  In contrast, the goals of Green IT include:

  • Reducing energy (electricity) usage and costs involved in running the hardware, including cooling.
  • Computer hardware designs for lower power consumption.
  • Software and hardware architectures that maximize the utilization of the computers in operation.
  • Reducing demands on the electricity grid during peak periods.
  • Reducing the overall carbon footprint.
  • Reducing various types of damage to the environment.

There is of course no hard line between Moore’s Law progress and Green IT.  As awareness of Sustainability and Green IT has increased, the chip people have been moving in that direction, and the Green IT people are looking to the chip people to bring them those innovations.  Nevertheless, I think that this distinction gives a good idea of what we are looking for from Green IT.  It will be Green IT, coupled with the benefits of Moore’s Law, that will keep us ahead of the curve on energy use.  In fact, Green IT has the potential for delivering a big leap forward.

Green IT Solutions

Within the last few years a number of technologies have emerged that promise drastic improvements in data center and computer energy demand.  A few of these are already becoming widely accepted.  The data center problem is being attacked on a number of fronts, and the best results are being achieved by combining solutions.

I will briefly mention some approaches in various areas.

Computer Hardware

  • High Efficiency Servers.  SeaMicro claims its servers can run the same software on one-quarter the power while taking up one-sixth the space of more conventional designs.
  • Virtualization and Server Pooling.  VMware, a leader in the field, claims up to 80% energy savings by running more “servers” on far fewer physical computers.
  • Solid State Drives.  Still very expensive, but costs are coming down.  They provide more operations/transactions per watt.
  • Green Workstations.  There are many things going on to improve upon the old fleet of desktops and monitors running 24/7 on the office floor.  We already mentioned more efficient power supplies.  Tied in with cloud and virtualization advancements in the data center, minimalist workstations are replacing full-sized desktops in the office.  Newer equipment generally runs more efficiently, such as the new Energy Star LED monitors.
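To see why server pooling can produce savings of the magnitude VMware claims, it helps to sketch the consolidation arithmetic.  Every number below (server count, utilization levels, power draw) is a hypothetical assumption, and the sketch ignores the fact that a heavily loaded host draws somewhat more power than an idle one:

```python
import math

def consolidation_savings(n_servers, avg_util, target_util, watts_per_server):
    """Estimate hosts needed and energy saved when lightly loaded
    servers are pooled onto fewer, more fully utilized hosts."""
    hosts_needed = math.ceil(n_servers * avg_util / target_util)
    before = n_servers * watts_per_server    # every machine powered on
    after = hosts_needed * watts_per_server  # only the pooled hosts
    return hosts_needed, 1 - after / before

# Hypothetical example: 20 servers idling at 10% utilization,
# consolidated onto virtualization hosts run at 70% utilization.
hosts, savings = consolidation_savings(20, 0.10, 0.70, 350)
print(f"{hosts} hosts replace 20 servers; energy savings ~{savings:.0%}")
```

Even after allowing for the extra power draw of the busier hosts, the order of magnitude of the savings survives.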

Software

  • Software-as-a-Service / Multi-Tenancy.  In the cloud, many separate business/corporate software accounts can be run on a single computer.  Process automation software (for example, customer relationship management) is a key productivity factor in most organizations.  It is extensive in that it serves many users in an organization, but it is generally not processor intensive.  A busy server may consume more energy than a relatively quiescent one.  However, if it can process many accounts simultaneously, the net energy savings can be substantial.  Salesforce.com claims that energy costs per transaction are reduced by orders of magnitude by running business software in the cloud!
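That orders-of-magnitude claim is at least plausible on the back of an envelope.  The throughput and power figures below are purely hypothetical illustrations, not vendor data:

```python
# Illustrative comparison of energy per transaction for a dedicated,
# lightly used business server vs. a multi-tenant cloud server.
# All figures are hypothetical assumptions for the sake of the arithmetic.

def wh_per_transaction(server_watts, transactions_per_hour):
    """Watt-hours of server energy attributable to each transaction."""
    return server_watts / transactions_per_hour

dedicated = wh_per_transaction(300, 100)        # one tenant, mostly idle
multi_tenant = wh_per_transaction(400, 50_000)  # thousands of tenants sharing

ratio = dedicated / multi_tenant
print(f"Dedicated:    {dedicated:.3f} Wh/transaction")
print(f"Multi-tenant: {multi_tenant:.4f} Wh/transaction")
print(f"Improvement:  ~{ratio:.0f}x")
```

The shared server draws somewhat more power, but because it amortizes that power over vastly more transactions, the energy cost per transaction collapses.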

Advanced Cooling

  • Advanced HVAC.  Coolerado’s revolutionary designs have documented data center energy savings in the 90% range – truly astounding results!
  • Ice Systems.  Ice Energy takes advantage of low off-peak electric rates to freeze water during the night, and the ice provides cooling during the day.
  • Environmental Cooling.  Simple, but effective:  outside air is pumped in during cold weather.  Geothermal cooling is another highly efficient technology in regions where it is practical.
  • Advanced airflow design.  This is often a part of other advanced cooling systems, and enables them to achieve their full potential.

Advanced Insulation

  • Radiant Barriers.  Innovative Insulation is a leading provider of this stupendously effective, yet inexpensive form of insulation based on space blanket technology.  I have spoken with engineers and technicians who claim to have achieved significant reductions in cooling costs in data centers and other structures using radiant barriers.

Alternative Energy

  • Solar, Methane, etc.  Alternative energy sources are often very effective when combined with other technologies that drastically reduce power consumption.  For example, some companies are becoming the driving financial forces behind methane harvesting at landfills, and they use the methane to lower their electricity costs significantly (and remove a dangerous greenhouse gas from the environment).
  • Microturbines, Fuel Cells, etc.  Local energy production can be advantageous (Bloom Energy is a well-known player in this field).
  • Grid Management.  Data centers can partner with utility companies to bring in technologies to help balance the grid.  The utilities may be willing to assume some of the costs of these projects.

Battery and Energy Storage Improvements

  • Lithium Air (and other Energy Storage advancements).  Battery technology may have more of an impact on electric vehicles than on Green IT, but the field bears watching.  Large battery backup systems are used in data centers along with emergency generators.  Lithium air batteries are several years away, but may offer ten times the storage density of lithium ion.  This increased density may actually allow data centers to use batteries for energy storage to take advantage of off-peak rates (like the ice systems), lowering costs and reducing strains on the power grid.  (Currently, battery systems are mostly used for very short-term smoothing between the moment grid power is lost and the time the emergency diesel generator kicks in.)
  • Cleaner battery chemistry.  Cleaner and more economical raw materials for large battery arrays may or may not directly affect energy costs, but they will lead to a cleaner environment.
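The off-peak arbitrage idea behind both the ice systems and the potential battery systems comes down to simple arithmetic.  The electric rates and the round-trip storage efficiency below are hypothetical assumptions:

```python
# Sketch of off-peak load shifting: buy cheap energy at night, store it
# (as ice or in batteries), and spend it during expensive peak hours.
# Rates, load, and round-trip efficiency are hypothetical assumptions.

def daily_cost_direct(kwh, peak_rate):
    """Cost of buying the daytime load at peak rates."""
    return kwh * peak_rate

def daily_cost_shifted(kwh, offpeak_rate, round_trip_efficiency):
    """Cost of storing energy off-peak; losses mean buying extra."""
    return (kwh / round_trip_efficiency) * offpeak_rate

load = 1000.0  # kWh of daytime cooling load to shift
direct = daily_cost_direct(load, peak_rate=0.20)
shifted = daily_cost_shifted(load, offpeak_rate=0.08, round_trip_efficiency=0.85)
print(f"Buy at peak:    ${direct:.2f}/day")
print(f"Shift to night: ${shifted:.2f}/day")
print(f"Savings:        {1 - shifted / direct:.0%}")
```

As long as the off-peak rate is well below the peak rate divided by the storage efficiency, the shift pays off, and it also relieves the grid exactly when it is most strained.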

These and other technologies not only show great promise; they will also prove to be essential as we move into the world of big data and greater energy challenges.  However, the best results can only be achieved through skillful and careful analysis.  Simply retrofitting a green technology may yield good results, or it may not.  Combination solutions are almost always optimal when the costs and benefits are factored in.  For example, solar power may be a good choice for a data center, but the cost of replacing most or all electricity with solar would probably be prohibitively expensive, and a very poor use of resources.  However, solar combined with a revolutionary cooling system, optimized air flow, and advanced computer hardware and software may be an excellent approach.

A final note:  Besides the tremendous benefits to the environment and society that Green IT improvements offer, they can also have a significant effect on an organization’s bottom line.  They can be outstanding investments, and a variety of financing options may be available.

Russell Kennedy Partners is proud to partner with a team of experts in various areas of data center energy conservation.

DISCLAIMER:  Russell Kennedy Partners has no financial affiliation with any of the companies discussed in this article at this time, nor are there any promotional considerations.  We are not endorsing any of the products mentioned, nor do we wish to imply that any of them are superior to their competitors.  We do not vouch for any of the claims mentioned here or elsewhere.  The products have been cited based on our own limited research and our acquaintance with engineers who work with some of the products in question.  These and other similar products are complex, and will only yield optimal results when deployed by competent engineers and specialists.  The reader is urged to do his or her own research.

Copyright © 2011-2012 Patrick D. Russell

 

Posted in ARCHIVE (Posts prior to 2013), Green IT, Sustainability

System Questions for Due Diligence

When you are planning an important software system, even in the early stages, and especially before you talk with consultants and vendors, you should ask some serious questions about what you are trying to do.  Do you really have a good vision of the endpoint of your endeavors: what the system will do, what it will look like, and how all the pieces will fit together?

Perhaps the most important facet of your due diligence is the clarity and coherence of your vision.  Now, this doesn’t have to be thought through 100% ahead of time.  The vision can evolve, but it is important that there be a kernel that grows and becomes more cogent as the planning develops.

As you progress through the planning stages there are a number of questions you should answer as part of the evolution of your vision and your system due diligence.  Here are some of those questions, broken into the general categories of Planning, Technology, Quality, and System Viability & Longevity:

Planning

  • Productive vs. Wasteful Planning:  Are your preliminary requirements gathering and planning sessions correctly focused, or are you wasting time?
  • Complete Project Scoping:  Have you really identified all the tasks required, their costs, and business impacts?
  • Community Knowledge:  Do the managers, planners, and designers really know what is going on?  Can they really speak for the needs of all who will be affected?
  • Collateral Issues Planning:  What about system configuration, training, knowledge transfer, maintenance, and future enhancements?  These collateral issues and extended planning concerns deserve attention up front, even if they are not fully resolved.

Technology

  • Right Technology for the Job:  Is the technology being considered really the best choice?  Have you really examined the alternatives?
  • Features, Capabilities, Scalability:  Are you getting the features, capabilities, and scalability you need now and will need in the future?
  • Security and Change Control:  How have these been addressed?  What are the risks in the cloud and in software-as-a-service?
  • Data Migration, Synchronization:  Loading, migrating, and connecting with data is a challenge that is easily underweighted, sometimes by a significant amount.  Have you really factored in all you will be up against here?
  • Legacy System Risks:  Is the original knowledge and documentation for your legacy systems still in-house, or are you facing the task of deciphering a lot of cryptic pieces?

Quality

  • Works for All Users, Customers:  Will the system serve all the users?  How about the customers?  Who will make sure?
  • “Industrial Strength”:  How will you know you are getting high-quality, “industrial strength” custom work?  Glitzy websites and brochures, and the touting of special methodologies and best practices, are not always indicative of quality.
  • Risks of Upgrades:  Will your staff and your business be downgraded by a software upgrade?

System Viability & Longevity

  • “Own” the System:  Will your organization really “own” the system, or will you remain dependent on the vendor?  What are your organizational goals for maintaining the system in-house or outsourcing?
  • Vendor, Platform Risks:  What happens if your vendor goes out of business or the platform you bought is discontinued?  Can you build “defensively”?
  • System Longevity:  Will your new system last for years, even decades?  A premium price tag won’t guarantee it.

Of course these are not the only issues that will demand attention in a software project, but this list is a starting point.

Please contact us with any of your ideas about due diligence for major software projects.

Copyright © 2011 Patrick D. Russell

Posted in ARCHIVE (Posts prior to 2013), Due Diligence

The Due Diligence Checklist (Part 4)

Backed Into a Corner?  Think Again…

Sometimes managers and business leaders feel like they have been backed into a corner.  Things are not the way they would like, but they don’t believe they can do anything about it – perhaps due to time, money, or the sheer overwhelming nature of the task.  However, there may be an expert out there who can surprise you with a straightforward solution that you had no idea was even possible.  People often don’t realize that many chronic problems have simple solutions.  If nothing else, an expert or a tech advisor can give you some structure and guidelines for what to do when you can afford a solution.

If all your choices seem bad, cast a wider net; go to a bigger wheel.  You may just be looking at the situation with blinders on.

Look for Major Trends That May Affect Your Strategies.

One common mistake we make is to assume that the future will be pretty much like the past.  We have seen how wrong this can be in the recent financial crisis.  Greenspan’s supposedly impregnable models just didn’t go back far enough, so he made one wrong choice after another.  American car companies overlooked the fact that women entered the workforce en masse in the 1980s.  Japan seized the opportunity by building cars that appealed to the working woman.  Today our middle class has been devastated, and many young people are living disenfranchised, marginal existences – perhaps with college degrees, lots of school loans, and jobs as department store clerks.

The world has changed and continues to change, and emerging trends may affect your business – tomorrow or five years from now.  We don’t have a crystal ball, but we can figure out some of these trends.  In the book, The Tech Advisor, we look at some examples of major technology shifts that caught many unprepared, even though industry insiders took them as a given.  Sometimes an expert advisor can help bridge this gap, and help business leaders get ahead of the curve.  Research studies are another valuable source of information.

The Golden Rule:  Don’t Confuse Strategy and Tactics.

Strategy versus tactics is part of a much larger discussion, with numerous perspectives and, without a doubt, agendas; therefore it will only be touched on briefly here.  Generally the idea is thought to have originated as a part of military planning, but it has been adopted in the business world.  Strategy involves long-term goals, vision, mission, and identity (“Who are we?”  “Toys R Us”), and usually represents long and careful planning.  Tactics are methods and means for pursuing a strategy: usually shorter-term activities, often combined with other tactics, and easily replaced when better ones are found.  However, when a tactic is pursued without a strategy, or instead of developing a strategy, it might be a knee-jerk reaction that is less likely to be productive.

During the dot-com bubble of the late 1990s, many companies went into merger and acquisition mode.  In some cases there was a confusion of tactics for strategy.  There were always reasons given, such as business synergies, but often the main reasons were:  Wall Street will like it; this will allow us to go public; the stock price will go up; and I’ll get rich.  These unfortunately were not tantamount to full-blown strategies that could unify companies.  The planning consisted of buying first and figuring out how to integrate later.  While this kind of planning, or lack thereof, affected all layers of the respective corporate cultures, it was often very obvious in the difficulties encountered with the inconsistent computer and software systems that somehow had to be merged.  With good planning, there could be a smooth, orderly transition.  Otherwise the result was chaos.

Copyright © 2011 Patrick D. Russell
Posted in ARCHIVE (Posts prior to 2013), Due Diligence

The Due Diligence Checklist (Part 3)

Don’t Overestimate the Flow of Information Up to Management.

We know what rolls downhill, but in organizations information does not roll uphill easily.  As a business leader, there may be many things you don’t see and are not even aware of in your organization.  These factors will be missing in your plans and calculations.  One of the purposes of bringing in a tech advisor is to bring these things to light.  (Workflow software that includes management rollups can help bring information through and organize it for analysis.)

When organizations grow they tend to build more layers of bureaucracy.  The hierarchical structure may be justified for business reasons, but it does tend to compartmentalize and filter information.  People often don’t like to report problems to their superiors, and those who do may be branded as complainers.  Some simply don’t know how to bring their reports forward in a constructive way.

Very recently I watched part of an episode of Undercover Boss in which the chief executive of a huge hotel complex spent time behind the check-in desk.  The company had recently “upgraded” their computer system.  Unlike the old system, which workers liked, the new system was difficult to use (too many windows and clicks), very slow to process, often froze up, and would spit credit cards out onto the floor if the clerks weren’t there to catch them at just the right time.  It caused a lot of frustration among the employees, and at busy times the guests ended up waiting in long lines because it took so long to process each party.  Prior to his undercover stint, the executive had no idea that this was going on.  He knew the computer system had been recently upgraded, but never got any feedback from the primary users or the guests.  Armed with the information, he arranged to correct the problems.

Don’t Overestimate Your Internal Confidentiality; Don’t Underestimate How Quickly Word Spreads.

This is the flipside of the last item.  Scuttlebutt comes from many sources and can go viral, and when it is negative it can have a devastating effect.  Just because the chief executive at the hotel complex hadn’t known about his bad computer system didn’t mean that his customers didn’t know.

Even information held in strict confidence can be deduced or figured out.  Analysts will interpret publicly known events.  (If a CFO unexpectedly leaves with no explanation, this causes worry and speculation among investors.)  Spouses may read each other’s moods and expressions and figure out that something very good or bad is going on.  The same goes for families, friends, and acquaintances.

Don’t Forget about the “Silent Majority” – Your Customers

Customers may be your end-users – for example, if you have a web presence for retail sales – in which case it behooves you to make sure that their experiences are easy, reliable, and satisfactory.  However, even when your customers don’t interact directly with your system, it can have a very powerful impact on their experiences and how they look upon your business.  The hotel guests at the front desk with the undercover boss are a good example.

The Due Diligence Checklist to Be Continued in Part 4.

Copyright © 2011 Patrick D. Russell
Posted in ARCHIVE (Posts prior to 2013), Due Diligence

The Due Diligence Checklist (Part 2)

Don’t Design By Consensus.

Building consensus in a project is important because it gets people on the same page and ensures that they will be ready to participate and contribute.  Recalcitrant individuals and groups can damage a project.  This can be prevented to a large extent by including representatives from all quarters, and factoring in their needs and concerns.

However, if you have to wait for consensus to make your major design and architecture decisions, you may be waiting a long time.  Project experts and managers can do the most by exercising leadership.  Most participants want and need to be heard and to have input on issues relevant to them, but they are not qualified system designers, and they won’t commit on major decisions because they don’t understand all the implications.  Design by consensus is not an efficient or effective way to build a system.

Know What You Don’t Know – and Get Answers.

Knowing what you don’t know is often more important than knowing what you do know.  Sometimes people will admit “I am no expert,” and yet insist on pushing for a particular solution, often against the advice of experts.  The experts are not always right.  People may have a true instinct or a personal vision, which trumps all the experts.  However, other factors and motives may be behind someone who insists on an approach, such as stubbornness, narcissism, cronyism, or nepotism.  People may simply be unaware of their limitations, and they just try to make the best of things they don’t understand very well.

It is important to do a self-assessment of one’s knowledge level, and to seek help and support in areas where one falls short.  As the old maxim teaches us, “A stitch in time saves nine.”  Solicit and take good advice.

Form a Complete Picture of the Challenge.

When planning a project, make sure to consider and weigh, as best you can, all the elements involved, including integration with other systems, data population and connections, risks associated with unknown factors, quality documentation, test and correction procedures, system benchmarks and audits, training, knowledge transfer, maintenance, and upgrades.  These elements may seem daunting, even overwhelming, when the project is already large in scope.  However, if they are included in the planning, they can generally be handled smoothly in the course of events.  Otherwise, elements not factored in ahead of time may prove disruptive.

See Blog Entry:  Due Diligence System Questions for more information on this topic.

The Due Diligence Checklist to Be Continued in Part 3.

Copyright © 2011 Patrick D. Russell
Posted in ARCHIVE (Posts prior to 2013), Due Diligence

The Due Diligence Checklist for Software Projects (Part 1)

Tech Advisors are first and foremost about due diligence.  They can support organizations in doing their due diligence at several different points in a software project.  Here is a checklist to help organizations do their own due diligence.  The detailed explanations will be covered in other posts.

Make Sure You Understand Pricing.

Watch out for disguised up-charges and price structures designed to push the buyer quickly into a higher bracket or into overage charges.

Make sure that you understand the full costs of a project, and are not leaving out some of the essential pieces in your calculations.

Get Multiple Perspectives for System Design.

A great software system will be built to serve all the users.  Be careful not to limit your intelligence gathering to narrowly focused managers or “domain experts.”  In any organization, as in the army, there are many more privates than lieutenants, and many more lieutenants than generals.  Even if a highly paid manager is “more important” than a subordinate worker, when you do the math on the number of workers versus managers, you can see the importance of making a system that serves all.

Honor Your Soft Assets, Your Human Capital.

Different individuals contribute in their own ways, for example, through efforts and diligence, ideas and creativity, knowledge and experience, leadership and consensus-building, specialized skills and competencies, connections and affiliations, and a willingness to offer feedback.  The goal is to bring out the greatness in the whole.  This means encouraging people to use their unique abilities in support of the overall effort, while minimizing the complaining and dissent that hinders progress.

Don’t Overestimate the Usefulness of Planning Sessions.

Planning sessions may waste valuable time and resources when they lack good structure and leadership, and when they are not guided in the right direction.  They are great for building initial consensus and generating high-level goals.  However, when they attempt to formalize requirements or produce detailed specifications without the benefit of expertise, much of the work may have to be redone and reinterpreted at a later stage.

The Due Diligence Checklist to Be Continued in Part 2.

Copyright © 2011 Patrick D. Russell
Posted in ARCHIVE (Posts prior to 2013), Due Diligence