Tag Archives: Cloud

Let’s Get Busy

The adoption of cloud computing by mid-market organisations has not been as fast or as wholesale as the early hype of five years ago suggested it would be.

The 2014 Deloitte study “Technology in the mid-market: Perspectives and priorities” found that while there was genuine enthusiasm for the operational enhancements that cloud-based solutions offer, full adoption of services in the cloud was still a work in progress for many companies. At that time only 14 percent of mid-market executives surveyed placed these deployments in the “mature, successful” stage.
For many mid-market organisations the adoption of infrastructure cloud falls into two camps: moving individual workloads to a cloud platform to pursue more isolated benefits, or adopting a full-service cloud strategy as a means of accessing greater economies of scale through wholesale business transformation.
With the latter approach proving the greater challenge, mid-market cloud activity has typically been consumed with token cloud workload strategies that have not delivered the full business benefits on offer.

Some of the typical challenges faced include:

  • Existing asset lifecycle – The organisation may be mid-stream in its asset lifecycle, and therefore less willing or able to decommission a heavy investment in existing on-premise ICT infrastructure as the result of a wholesale shift to cloud.
  • Privacy and security – The study found that 34 percent of organisations were still facing internal concerns over data privacy and security.
  • Data integrity – More than one-third of respondents in the study believe they might not be able to ensure data integrity and reliability once information is pushed up to the cloud, although the study found this concern waning as the cloud market matures and the offerings improve.
  • Interoperability – More than one in five respondents cites the complexity of integrating information in the cloud with other core business systems as the greatest challenge in deploying or using cloud-based services.
  • Integration – Half of the respondents acknowledge that in adopting cloud-based services they might face issues when integrating proprietary systems with their external stakeholders, from business partners to customers. A similar percentage cites concerns over the migration itself, and respondents continue to be concerned about whether cloud solutions can easily integrate across business systems, particularly when considering a suite of solutions to handle various business functions.
  • Globalisation – One thing that is not always top of mind is how cloud services support the mid-market where globalisation continues to play an important role in business strategy. Two-thirds of the mid-market respondents in the study said that international orientation is important for their companies, a finding consistent with another recent Deloitte survey in which 66 percent of mid-market executives said their companies had some level of global revenue.
    While one might think that cloud is the perfect answer for supporting a global presence, there are still challenges around connectivity, latency and sovereignty that need to be considered, not to mention how the global presence is supported in these locations.

Time to get busy…

The low-hanging fruit for mid-market organisations is to take on a SaaS solution that delivers immediate benefits, such as dealing with long-term legacy pain points.

Organisations that have been slow to adopt cloud may have been dipping their toes into this area, but now they need to get busier building capability in procuring and managing SaaS cloud deployments.

SaaS promises ease of provisioning and configuration without the need for in-house technical expertise; another key benefit is the ability to scale the application up or down with the demands of the end-user base.

The SaaS landscape is now an arena of countless specialised solutions that are quick to deploy and bring immediate business functionality. Just one example I discovered this month of how specialised solutions can emerge is Deloitte Private Connect, a cloud-based offering that combines shared-ledger accounting with automated bookkeeping, benchmarking, and an online portal and dashboard. The service is rapidly being taken up as a no-brainer by many clients.

Just one of many examples of why it’s time to get busy!

Using AWS leads to cloud economics

Cloud makes a simple promise: “you only pay for what you use” – right?

Sounds simple enough, and while it is fundamentally true, it is not the whole story: there are complexities that need to be considered in order to understand the true cost of the cloud equation for your business.

Some AWS customers complain to me that their cloud experience has not been as cheap as they were expecting, and the simple answer is that they had the wrong expectations to start with.

To their credit, AWS provide a lot of transparency on their pricing, with all the facts and calculators you need to explore the various scenarios and how they impact the price you pay. However, don’t be lulled into a false sense of security: cloud economics can be quite complex and even the calculators are difficult to follow at times. Trust me, I have ploughed through a number of very complex Excel spreadsheets to work out the TCO, and “you only pay for what you use” actually comes with a wide range of variables that all need to be considered.
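To make that concrete, here is a minimal sketch (in Python, with purely illustrative prices that are not real AWS rates) of the kind of comparison those spreadsheets boil down to. It shows why utilisation is the variable that decides whether on-demand or reserved pricing wins:

```python
HOURS_PER_MONTH = 730

on_demand_rate = 0.12   # $/hour, illustrative only - not a real AWS rate
reserved_rate = 0.07    # $/hour effective reserved rate, also illustrative
utilisation = 0.40      # fraction of the month the workload actually runs

# On-demand: you pay only for the hours you actually run.
on_demand_monthly = on_demand_rate * HOURS_PER_MONTH * utilisation

# Reserved: a lower rate, but you pay for every hour, used or not.
reserved_monthly = reserved_rate * HOURS_PER_MONTH

print(f"On-demand: ${on_demand_monthly:.2f}/month")  # $35.04 at 40% utilisation
print(f"Reserved:  ${reserved_monthly:.2f}/month")   # $51.10 regardless of use
# Re-run with utilisation = 0.9 and the reservation wins ($78.84 vs $51.10):
# "pay for what you use" only beats reserving when usage is genuinely bursty.
```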

AWS Pricing Philosophy

To illustrate, AWS have a number of core pricing constructs that you can follow:

Pay as you go

• No minimum commitments or long-term contracts required

• Capex -> Opex

• Turn off when you don’t need it

Pay less per unit when you use more

• Tiered pricing and volume discounts (see the worked example after this list)

Pay even less when you reserve

• Reserved pricing

Pay even less as AWS grows

• Efficiencies, optimisations and economies of scale result in savings that are passed back to you in the form of lower pricing

Custom Pricing
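The “pay less per unit when you use more” construct in particular trips people up when estimating. Here is a minimal sketch, with invented tier boundaries and rates (real AWS rate cards vary by service and region), of how tiered pricing is typically computed:

```python
# Invented tiers for illustration: (units in this band, price per unit).
TIERS = [
    (1_000, 0.09),          # first 1,000 units
    (9_000, 0.07),          # next 9,000 units
    (float("inf"), 0.05),   # everything beyond 10,000 units
]

def tiered_cost(units: float) -> float:
    """Charge each band of usage at its own rate; the marginal rate falls
    as usage grows, so the average cost per unit also falls."""
    cost, remaining = 0.0, units
    for band_size, rate in TIERS:
        used = min(remaining, band_size)
        cost += used * rate
        remaining -= used
        if remaining <= 0:
            break
    return cost

print(tiered_cost(500))      # 45.0  - all usage inside the first tier
print(tiered_cost(15_000))   # 970.0 - usage spans all three tiers
```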

AWS Pricing Fundamentals

To add further complexity, pricing is applied across these fundamental components (a back-of-envelope example follows the list):

Compute (Instances)

  • Clock hours of server time
  • Machine configuration (instance type)
  • Purchase type (On-Demand, Reserved, Spot)
  • Operating systems and software packages

Block Storage

  • Additional storage, backups, data transfer

Load balancing

  • Data Processing

Detailed Monitoring

Elastic IP addresses

Data Transfer

  • Regional Data Transfer
  • Data Transfer out
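As a rough illustration of how these components combine into a bill, here is a back-of-envelope monthly estimate in Python. Every unit price below is an assumption for illustration only; check the AWS calculators for current, region-specific figures:

```python
# Illustrative unit prices only - real prices vary by region and service.
instance_hours  = 730     # one instance running the whole month
instance_rate   = 0.12    # $/hour (depends on instance and purchase type)
ebs_gb          = 100     # provisioned block storage
ebs_rate        = 0.10    # $/GB-month
transfer_out_gb = 50      # data transferred out of AWS
transfer_rate   = 0.12    # $/GB (transfer in is typically free)

monthly = (instance_hours * instance_rate          # compute
           + ebs_gb * ebs_rate                     # block storage
           + transfer_out_gb * transfer_rate)      # data transfer out
print(f"Estimated monthly bill: ${monthly:.2f}")   # $103.60 in this example
```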

Traps for young players

Some of the things I have come across may be beneficial to others, and they further highlight the complexities involved in optimising cloud arrangements:

  • Unlike a VPS or even a dedicated server, AWS instances are not designed to be persistent; they are designed to be pieces of a larger infrastructure that makes up your application. There are strategies that can be applied here, such as taking regular snapshots and using an auto-scaling group of one for instant failover.
  • In some scenarios it may not be a good idea to use non-EBS-optimised instances with provisioned IOPS, as this can impact the pricing of your model.
  • It may not be advisable to reserve instances for long periods, because reservations are highly specific; an instance should meet the following criteria in order to attract the discount:
    • same instance type
    • same region
    • same availability zone
  • With bandwidth costs (often the forgotten component) there is not much that can be done to reduce the cost itself; however, there are tools and third-party products that increase performance, which can be very helpful to the equation.
  • Another idea is to serve your static media from S3, as this has the effect of reducing the load on your server.
  • Sometimes it’s a good idea to provision multiple EC2 instances using auto-scaling so that the environment can scale as demand rises. This attracts lower costs outside of peak hours, and for your minimum instance count you might consider AWS reserved instances, as this may reduce your costs further (a minimal sketch of this pattern follows the list).
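Here is a minimal sketch of that auto-scaling pattern, using Python and boto3 (assumed to be installed and configured with credentials). All names, the AMI ID and the instance type are hypothetical placeholders:

```python
import boto3

autoscaling = boto3.client("autoscaling", region_name="ap-southeast-2")

# Launch configuration: describes the instances the group will launch.
autoscaling.create_launch_configuration(
    LaunchConfigurationName="web-launch-config",  # hypothetical name
    ImageId="ami-12345678",                       # placeholder AMI ID
    InstanceType="t2.micro",                      # placeholder instance type
)

# MinSize=1 gives the "auto-scaling group of one" failover mentioned above:
# if the instance dies, the group replaces it automatically. MaxSize lets
# the environment grow as demand rises, and the baseline (minimum) instance
# is the natural candidate for a reserved-instance purchase.
autoscaling.create_auto_scaling_group(
    AutoScalingGroupName="web-asg",               # hypothetical name
    LaunchConfigurationName="web-launch-config",
    MinSize=1,
    MaxSize=4,
    AvailabilityZones=["ap-southeast-2a"],
)
```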

(seek your own cloud economics advice that considers your own particular circumstances – if your cloud pain persists, see your cloud advisor!)

Design for failure and nothing will fail

I close with Amazon EC2’s catchphrase – “Design for failure and nothing will fail” – and echo the sentiment that even with cloud, a considerable amount of planning, strategy and architecture needs to be considered (cloud has changed nothing in some respects).

AWS provide a lot of information and support in this regard; here is an example whitepaper on cloud best practices:

https://media.amazonwebservices.com/AWS_Cloud_Best_Practices.pdf
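In code, “design for failure” often starts with something as simple as retrying transient faults with exponential backoff and jitter. The sketch below (Python; `fetch` is a hypothetical stand-in for any call to a remote service) shows the shape of the pattern, along the lines of what best-practice material like the whitepaper above recommends:

```python
import random
import time

def call_with_retries(fetch, max_attempts=5):
    """Retry a flaky remote call, backing off exponentially between tries."""
    for attempt in range(max_attempts):
        try:
            return fetch()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # not transient after all - surface the failure
            # Sleep 1s, 2s, 4s, ... plus jitter so callers don't retry in sync.
            time.sleep(2 ** attempt + random.random())
```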

The sooner that cloud disappears, the better

Jack Dorsey said it best…

“I think the best technologies disappear, they fade into the background, and they’re relevant when you want to use them, and they get out of the way when you don’t.”

It would be fair to say that Jack knows quite a bit about how to build success with technology: he is the co-founder and chairman of Twitter and the founder and CEO of Square, both of which appear on Fast Company’s list of the world’s top 50 most innovative companies. But why would Jack suggest that the best technology will disappear? Why didn’t he say that the best technology is found in the social networks or the very application services that have made him so successful today? Perhaps Jack knows that the answer to these questions sits within the lessons of history, and in particular the birth of the internet.

That famous eureka moment

On the 29th of October 1969, Charley Kline, a UCLA graduate student, sat in a UCLA lab and typed the first message on the ARPANET, the progenitor of what was to become the global internet. The team at UCLA used their computer to log in to a different computer that sat in a lab at the Stanford Research Institute, some 400 miles away, connected by a cable. This was a massive breakthrough: it proved that different types of computers could be made to communicate using packets rather than circuits – a major step along the path towards computer networking. The computer scientists had developed a packet-switching protocol called NCP which allowed the networking of different kinds of hardware and software. NCP would eventually be replaced by TCP/IP in 1983, and soon after that would come the birth of the modern-day internet. The breakthrough of 1969 seemed to validate the earlier predictions made by computer scientists:

“In a few years, men will be able to communicate more effectively through a machine than face to face, that is rather a startling thing to say, but it is our conclusion.” J.C.R. Licklider and Robert W. Taylor, 1968, The Computer as a Communication Device

“It seems reasonable to envision, for a time 10 or 15 years hence, a ‘thinking center’ that will incorporate the functions of present-day libraries together with anticipated advances in information storage and retrieval.” J.C.R. Licklider, Man-Computer Symbiosis, 1960.

“If computers of the kind I have advocated become the computers of the future, then computing may someday be organized as a public utility just as the telephone system is a public utility. The computer utility could become the basis of a new and important industry.” John McCarthy, in a speech delivered at the MIT Centennial, 1961.

The early hyperbole

The breakthrough in computer networking triggered a lot of excitement and fuelled a widespread belief that in just a few short years incredible disruptive changes would be seen across modern society. This new ability for different machines to talk to each other over long distances gave birth to a string of bold predictions, such as an impending explosion of computing networks linking society with new artificial intelligence, even resulting in robots in the home doing the housework. Workers from the seventies have told of newsreels linking the technology breakthrough with shorter hours and less work, as much of the daily office grind would be taken over by computers. These newsreels showed people at the beach and on picnics while the computer tape-reels were spinning, reducing the need for humans to work five days a week.

The hype played to the imagination of the populace; it certainly fuelled the imagination of the creative minds who wrote television shows like “The Jetsons” and “Lost in Space” and movies like “2001: A Space Odyssey”. Disney’s Tomorrowland had long queues of visitors imagining the soon-to-arrive marvels of space travel, robots, jetpacks and artificial intelligence in the home.

While many of the predictions of the era were just too far-fetched, some would eventually prove close to the mark – but even these were way too ambitious in their timings.

While most of us in 2014 don’t travel by jetpack or have a robot to bring us our slippers, revolutionary change did eventually occur with the development of the global internet. However, the disruption occurred in many iterative phases, with many false starts – even a boom and a bust – and certainly over a much longer period than the early hype predicted.

The predictions of the 60s and 70s that every home would soon have a robot were never realised, but perhaps this is yet to happen: the South Korean Ministry of Information and Communication has predicted that every South Korean household will have a robot between 2015 and 2020, and that by 2018 surgery will be routinely performed by robots. That government is investing heavily in a research and development program to make it happen, but whether the predictions are correct, and in the predicted timeframe, remains to be seen.

Disappearing innovation

Now apply the Jack Dorsey test to the innovative packet-switching technology developed in previous decades. As ubiquitous as TCP/IP has become today, and as often as we use it and depend on it, it has disappeared from the daily thoughts of those who use it. Even though it forms the fabric behind the connectivity of the digital generation, most of us would rarely give TCP/IP a thought when we send an email, watch a YouTube video or update our Facebook status. TCP/IP helped to deliver this page to you, but you probably didn’t think about that when you started to read. The Jack Dorsey test brings us to the conclusion that TCP/IP is one of those ‘best’ technology innovations that has disappeared just the way Jack described it should.

The cloud hyperbole – is history repeating?

John McCarthy should be heralded as the founder of cloud computing, one of its first believers and pilgrims. In 1961 he effectively predicted cloud computing, presenting the concept of ‘utility computing’ that was one day going to change the world; some fifty years later those days have begun to unfold.

John McCarthy’s utility computing, today paraded under the banner of cloud computing, is having a bigger, faster impact than TCP/IP, in that it represents the converged maturity of a much broader spectrum of technologies already used by millions of people. But just like the predecessor innovation of TCP/IP, cloud too has been surrounded by much hyperbole and bold predictions.

However, in the case of cloud computing the innovation has been accompanied by an army of IT vendors with cashed-up marketing mega-machines aiming to stir up a cloud frenzy, all with the objective of getting a bigger slice of the opportunity to sell more tin, software and services. Can we blame the suppliers? No – the cloud hype came at a time when the industry needed a boost after the years of the GFC.

Behind the hype there is no dispute that the cluster of technological innovations behind cloud computing has already started to change the world, but not everything is going as fast, or to the scale, that has been predicted. Some may recall the early cloud predictions in 2009 that the majority of businesses would not have any in-house IT at all within a few years. And a high-profile CIO was reported to say in 2010 that with cloud his firm would “never need to buy a server, rack or data centre again”. Fast forward to 2014 and that firm continues to buy all those things. So while these predictions hold legitimacy in their underlying concepts, in many organisational settings they are not even close to realisation.

The predictions of “wholesale” enterprise adoption are taking much longer to realise than the early hype suggested. Just as we are still waiting to see robots in every home, some of the bold cloud predictions are taking much longer to play out.

In the enterprise space the complexity of the obstacles in the way of wholesale cloud adoption takes its toll as the years tick by, and while the technology continues to mature, the demand side takes much longer to work out the how, why and when of cloud adoption. Why are things slower than predicted? Perhaps because the hype focuses on the maturity of the technology and ignores the maturity of the enterprise environment and its capacity, or even its appetite, for the disruptive business change that wholesale cloud adoption initiates.

At the 1969 eureka moment in computer networking, the computer scientists knew they had unlocked a door that would lead to further technological advancement, but they believed things would progress much more quickly than they eventually did. Similarly today, as the technology of cloud matures and on the surface appears ready for mass adoption, many cloud pundits are scratching their heads wondering why wholesale enterprise adoption remains elusive. Why are enterprises only making cloud moves with the components that sit on the fringe of the enterprise IT organisation – email, collaboration, web sites, open data, test and development environments, point solutions and proofs of concept?

Why is there very little heavy lifting by enterprises into these cloud environments?

Perhaps the hype has always been too focused on the cleverness of the technology – the carriage, as it were – without enough focus on the content. The cleverness of the technology doesn’t by itself solve the complexity of the consumption environment or the commercial hurdles that need to be overcome. The technology can always be revered, but what is needed now is its translation into tangible business value and meaningful services that can actually be implemented with the least business risk, making a significant commercial difference.

While some technologists understandably place a lot of value in the technology stack and what it can achieve, the business is trying to survive in a harsh marketplace and faces a hard truth: the market only values execution in the delivery of tangible business value. The substance of value lies in the content delivered and not in the carriage, despite the fact that the carriage makes it happen, and far more easily than was possible before. The gift is always much more valuable than its packaging.

What is needed now is the ability to prise apart the grip of ageing legacy solutions that hold many businesses in a tight hand hold – the legacy that is just not ready for cloud migration. Needed is the ability to unravel the “IT hairball” that has taken 30 years to develop and that business is choking on. While cloud is the solution, the problem is still the same as it has always been: how to get from current state to future state.

Even with the cloud promise, the business is still faced with the same challenge: the legacy system has to be remediated, updated or replaced in the transformation process, and this still comes with the same levels of cost, risk and disruption.

Time for cloud to disappear

Just as occurred with TCP/IP, cloud computing will bring much of the predicted change, but over a longer period than predicted. There is a point in time when cloud should begin to disappear, to become part of the underlying fabric that connects the world with its data, customers with information, the business with the delivery of services, and digital consumers with their content.

As it disappears, the focus needs to finally shift away from technology and onto services, business innovation, and the real, tangible business value that makes a difference – such as developing new and better ways of delivering granular, just-in-time information to customers no matter where they are. This is what matters most. In this context, the sooner that “cloud” disappears the better.

Cloud and multi-sourcing brings a focus on vendor relationships

While enterprise IT is being bombarded with disruptive changes like cloud, BYOD, mobility and big data, the traditional challenges also remain, such as replacing or remediating scores of ageing legacy systems that have become deeply embedded within the engine room of the organisation.

As enterprise IT is pulled in every direction, many will wisely pursue expanded sourcing strategies that include cloud to help cope with the rapid rate of change.
As enterprises move deeper into diverse multi-sourcing scenarios, a heightened dependency follows on being able to effectively manage these relationships to deliver against key business objectives and contribute towards stronger competitive differentiation.

Continue reading Cloud and multi-sourcing brings a focus on vendor relationships

What to do when the cloud business case just doesn’t stack up

When I was consulting on a cloud readiness assessment, I spoke with a CEO who told me they had already undertaken an evaluation of whether to move their corporate email to a SaaS solution; the evaluation showed that the cost of moving to cloud far exceeded the cost of the in-house service, and therefore the business case just didn’t stack up.

This is an extract of the conversation that I had at that time…

“The results of the review showed that compared with our internal email service, the SaaS solution would be twice as expensive and half the speed!” the CEO said.

“Do you believe that?” I asked.

“Well, no,” he replied.

“Why not?” I asked, as he now had my curiosity piqued.

“Well, the review was undertaken by the IT Manager who looks after our email and storage solutions, and quite frankly we aren’t sure that he was… shall we say, being all that objective!”

Continue reading What to do when the cloud business case just doesn’t stack up

Cloud hype wanes while risk concern remains.

For a long time the hype around cloud computing has been more prevalent than the adoption, with enterprise take-up on any significant scale seeming to stall against some significant barriers.

More recently it seems that the hype around cloud computing is finally settling down, and the discussion is being replaced by a more common-sense dialogue on how cloud can best be leveraged as a legitimate component of the enterprise sourcing strategy.
Much of my work continues to be around assessing enterprise readiness for cloud, but typically cloud is being absorbed into the wider discussion around ‘outsourcing’, ‘right-sourcing’, governance and internal vendor management capability.

Continue reading Cloud hype wanes while risk concern remains.

So what happened to the Salesforce data centre in Australia?

High-profile sales in the public sector may be eluding Salesforce.com, and perhaps this can be attributed to the concerns held by public sector CIOs about data sovereignty and the Australian Privacy Act.
With high-profile breaches of shared-service cloud data centres in the news, it seems the market has some way to go before the public sector will be ready to move sensitive data and IP to an overseas data centre run by a US or other foreign-owned company. Continue reading So what happened to the Salesforce data centre in Australia?

Why Business-as-usual will sink the CIO

The CIO title has been around for some three decades, and many commentators have taken turns at suggesting what the three letters really stand for, ranging from the positive, Chief Innovation Officer, to the negative, Career Is Over.
If I were to take my turn at renaming the CIO title, I would not come up with something so predictable, but rather suggest that the CIO acronym needs to stand for “Continuous Improvement Opportunist”. Continue reading Why Business-as-usual will sink the CIO

Prosumers deliver the real future shock for CIOs

When the futurist Alvin Toffler introduced the notion of the ‘prosumer’ in the early 1980s, he described it as the “progressive blurring of the line that separates producer from consumer”.

Toffler described a new age of prosumption as the arrival of a new form of economic and political democracy, with self-determined work, labour autonomy and autonomous self-production. His concept of prosumption was centred on consumers taking a more active role in the production of “mass customized” products – hence the name prosumer, a blend of producer and consumer.

While Toffler was thinking about new approaches to industrial production, he may not have realised just how disruptive the prosumers he began to describe would eventually become in the digital age.

Continue reading Prosumers deliver the real future shock for CIOs

Planning for Cloud Computing means preparing for outages

A video blog by Scott Stewart, researcher and industry analyst, discussing the recent Amazon outage and what it means for your cloud strategy.
Stay tuned – more video blogs are coming!

[youtube=http://www.youtube.com/watch?v=bjDfMKlhOcc&feature=plcp]