
Some innovations to watch

There is a lot of talk about hot new innovations like 3D printing, but there is always a stream of lesser-known innovations trying to break through into the mainstream as well. The following is a selection of some that are interesting and certainly worth watching:

Haptics (weight shifting handsets, electrovibration, augmented reality)

3D imagery and content

  • There are a number of emerging technologies that allow for 3D viewing of content on screens without the need for 3D glasses. These technologies can be easily incorporated into future generations of smartphones. The technologies available today include:
    • Parallax barrier – a filter that sits over the screen and directs a different image to each eye, allowing the viewer to form a combined image and see the phone’s display in 3D.
    • Corrected, convergent barrier – unlike displays built on a parallax barrier, this allows more than one user to view 3D effects on a smartphone at the same time.
    • Lenticular lens technology – uses an array of small cylindrical lenses over the screen to steer the left image to the left eye and the right image to the right eye; because each eye sees the correct image, the result is a perception of depth. Read more online: http://www.brighthand.com/default.asp?newsID=17285&p=2

Contextual awareness

  • Context awareness is predicted to fundamentally change the nature of interactions with information devices and services. A number of factors are contributing to the expected rapid emergence of contextual awareness such as:
  1. increased processing power of devices
  2. improved connectivity
  3. new and innovative sensing capabilities
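As a simple illustration of how such signals might combine, the Python sketch below (hypothetical names and rules throughout) derives a notification mode from a snapshot of sensor-derived context:

```python
from dataclasses import dataclass

@dataclass
class Context:
    """A snapshot of sensor-derived context for a device user."""
    location: str            # e.g. derived from GPS or Wi-Fi
    hour: int                # local time of day, 0-23
    in_calendar_event: bool  # from the user's calendar

def notification_mode(ctx: Context) -> str:
    """Pick a notification mode from the combined context.

    A context-aware device acts on the combination of signals,
    not on any single sensor in isolation.
    """
    if ctx.in_calendar_event and ctx.location == "office":
        return "silent"          # likely in a meeting
    if ctx.hour >= 22 or ctx.hour < 7:
        return "do-not-disturb"  # night time
    return "normal"

print(notification_mode(Context("office", 14, True)))  # silent
print(notification_mode(Context("home", 23, False)))   # do-not-disturb
```

The point of the sketch is the fusion step: each individual signal is cheap, but the interpretation only emerges from combining them.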

Enhanced use of presence

  • With the increasing number of communication channels, the use of presence information will grow as a way to facilitate better communication and reduce the time wasted on failed attempts to communicate.
  • As more users make their presence information available for subscription by others, new opportunities will emerge for enhanced collaboration and communication experiences. Read more online: http://en.wikipedia.org/wiki/Presence_information
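As a rough illustration of the subscription model behind presence, the following Python sketch (all names hypothetical) shows watchers subscribing to a user’s presence and being told when it changes:

```python
from collections import defaultdict

class PresenceService:
    """Minimal publish/subscribe presence model (illustrative only)."""

    def __init__(self):
        self.status = {}                  # user -> current presence
        self.watchers = defaultdict(set)  # user -> watchers subscribed to them

    def subscribe(self, watcher: str, user: str) -> None:
        """A watcher asks to be notified of a user's presence changes."""
        self.watchers[user].add(watcher)

    def publish(self, user: str, status: str) -> list:
        """The user publishes new presence; return notifications to deliver."""
        self.status[user] = status
        return [(w, user, status) for w in sorted(self.watchers[user])]

svc = PresenceService()
svc.subscribe("alice", "bob")
notifications = svc.publish("bob", "in a meeting")
print(notifications)  # [('alice', 'bob', 'in a meeting')]
```

Real presence systems (XMPP, for example) add authorisation of subscriptions and richer status, but the subscribe-then-notify shape is the same.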

Greater use of voice and tone recognition

  • Voice biometrics works by digitizing a profile of a person’s speech to produce a stored model voice print, or template.
  • Biometric technology reduces each spoken word to segments composed of several dominant frequencies called formants.
  • Each segment has several tones that can be captured in a digital format.
  • The tones collectively identify the speaker’s unique voice print. Voice prints are stored in databases in a manner similar to the storing of fingerprints or other biometric data.
  • Not only can these technologies be applied to verify a person’s identity, but tone recognition can also be used to detect a person’s health or emotional state.
  • This technology will open new opportunities for more responsive or interactive experiences for users of technology.
  • Read more online: http://www.globalsecurity.org/security/systems/biometrics-voice.htm
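In grossly simplified terms, a stored voice print can be thought of as a vector of dominant-frequency (formant) values, and verification as a similarity test against that template. The Python sketch below illustrates the idea with cosine similarity; the names, frequency values and threshold are all hypothetical, and real voice biometric systems are far more sophisticated:

```python
import math

def similarity(print_a, print_b):
    """Cosine similarity between two stored voice prints.

    Each voice print is modelled here as a flat list of dominant
    frequencies (formants, in Hz) extracted from speech segments --
    a toy stand-in for a real biometric template.
    """
    dot = sum(a * b for a, b in zip(print_a, print_b))
    norm = (math.sqrt(sum(a * a for a in print_a))
            * math.sqrt(sum(b * b for b in print_b)))
    return dot / norm

enrolled = [730, 1090, 2440, 520, 1840]  # template stored at enrolment
claimed  = [725, 1100, 2430, 515, 1850]  # sample captured at verification
impostor = [310, 2020, 2960, 700, 1220]  # a different speaker

THRESHOLD = 0.999  # hypothetical acceptance threshold
print(similarity(enrolled, claimed) > THRESHOLD)   # True: same speaker
print(similarity(enrolled, impostor) > THRESHOLD)  # False
```

The design choice worth noting is that only the template is stored, not the audio, which is what makes voice prints comparable to fingerprints in a database.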

Intelligent routing to devices

  • Future technology services and applications are expected to incorporate location, presence and context awareness that combine to provide intelligent routing capabilities.
  • A good example may be an application provided by a local council that allows constituents and staff to pinpoint council issues while on the move.
  • These applications will take advantage of the features of smartphones that can take photos and have GPS support, to provide the council with the precise description and location of street-based issues. Intelligent routing then enables these problems to be reported instantly to the responsible team.
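A minimal sketch of that routing step might look like the following Python, where the report structure, the category-to-team mapping and all names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class IssueReport:
    """A constituent's report, captured on a smartphone."""
    category: str
    description: str
    lat: float      # GPS latitude from the handset
    lon: float      # GPS longitude from the handset
    photo: str      # filename of the attached photo

# Hypothetical mapping of issue categories to responsible council teams
TEAM_FOR_CATEGORY = {
    "pothole": "roads",
    "graffiti": "cleansing",
    "streetlight": "electrical",
}

def route(report: IssueReport) -> str:
    """Route the report to the responsible team, or to triage if unknown."""
    return TEAM_FOR_CATEGORY.get(report.category, "triage")

report = IssueReport("pothole", "Deep pothole outside no. 12",
                     -33.8688, 151.2093, "pothole.jpg")
print(route(report))  # roads
```

The "intelligence" in a real system would sit in the mapping: combining category with location and presence data to pick not just the right team but the nearest available crew.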

Laser based displays within eyewear

Brain wave sensing

Eye tracking and response monitoring

  • Eye tracking technology monitors the gaze and responses of individuals in a range of settings.
  • The process involves the use of a device for measuring eye positions and movements, which are then analysed through computer applications.
  • Eye tracking technology is used in a variety of medical, scientific and research fields and is gaining popularity.
  • Read more online here: http://www.videoeyetracking.com/eye_tracking_technology.html

Gaming interfaces and controls to run business applications

  • Universities and research centres are exploring new dimensions in human-computer interfaces.
  • With the increase in gaming amongst the younger generation, the gaming interface is being considered as a potential interface for business applications.
  • Examples include new input devices that borrow from gaming console design, and gaming software engines applied to testing business process models and workflows.
  • Expect this research to continue its search for commercial applications.

Intelligent computing in the device

  • Computer chips that could mimic human brain functionality are close to commercial reality.
  • IBM recently announced an “unprecedented” step forward in creating intelligent computers that collect, process, and understand data quickly.
  • The prototype chips incorporate a simplified model of the human brain, which has billions of neurons and trillions of synapses.
  • IBM said the chips are the foundation of what could eventually be a “mammalian-scale system” of 10 billion neurons and 100 trillion synapses, with power consumption and size that rival the human brain.
  • Read more online: http://www.dailygalaxy.com/my_weblog/2011/09/a-giant-step-closer-to-intelligent-computers-that-mimic-human-brain-based-on-a-mammalian-scale-syste.html

Virtual worlds collaboration

  • Virtual worlds will emerge as an important component in the next generation of collaboration tools for the workplace.
  • Some analysts predict that the 3D web will, in five years, be as important for work as today’s web.
  • Read more online here: http://www.leadingvirtually.com/?p=26

Twitter: @CIOMatters


The sooner that cloud disappears, the better

Jack Dorsey said it best:

“I think the best technologies disappear, they fade into the background, and they’re relevant when you want to use them, and they get out of the way when you don’t.”

It would be fair to say that Jack knows quite a bit about how to build success with technology. He is the co-founder and chairman of Twitter and also the founder and CEO of Square, both of which appear on Fast Company’s list of the world’s top 50 most innovative companies. But why would Jack suggest that the best technology will disappear? Why didn’t he say that the best technology is found in the social networks or the very application services that have made him so successful today? Perhaps Jack knows that the answer to these questions sits within the lessons of history, and in particular the birth of the internet.

That famous eureka moment

On the 29th of October 1969, Charley Kline, a UCLA graduate student, sat in a UCLA lab and typed the first message on the ARPANET, the progenitor of what was to become the global internet. The team at UCLA used their computer to log in to a different computer in a lab at the Stanford Research Institute, some 400 miles away, connected by a cable. This was a massive breakthrough: it proved that different types of computers could be made to communicate using packets rather than circuits – a major step along the path towards computer networking. The computer scientists had developed a packet-switching protocol called NCP, which allowed the networking of different kinds of hardware and software. NCP would eventually be replaced by TCP/IP in 1983, and soon after that would come the birth of the modern-day internet. The breakthrough of 1969 seemed to validate the earlier predictions made by computer scientists:

“In a few years, men will be able to communicate more effectively through a machine than face to face. That is rather a startling thing to say, but it is our conclusion.” J.C.R. Licklider and Robert S. Taylor, The Computer As A Communications Device, 1968.

“It seems reasonable to envision, for a time 10 or 15 years hence, a ‘thinking center’ that will incorporate the functions of present-day libraries together with anticipated advances in information storage and retrieval.” J.C.R. Licklider, Man-Computer Symbiosis, 1960.

“If computers of the kind I have advocated become the computers of the future, then computing may someday be organized as a public utility just as the telephone system is a public utility. The computer utility could become the basis of a new and important industry.” John McCarthy, in a speech delivered at the MIT Centennial, 1961.

The early hyperbole

The breakthrough in computer networking triggered a lot of excitement and fuelled a widespread belief that incredible disruptive changes would sweep across modern society within just a few short years. This new ability for different machines to talk to each other over long distances gave birth to a string of bold predictions, such as an impending explosion of computing networks linking society with new artificial intelligence, even putting robots in the home to do the housework. Workers from the seventies have told of newsreels linking the technology breakthrough with shorter hours and less work, as much of the daily office grind would be taken over by computers. These newsreels showed people at the beach and on picnics while the computer tape-reels spun, reducing the need for humans to work five days a week.

The hype played to the imagination of the populace. It certainly fuelled the imaginations of the creative minds who wrote television shows like “The Jetsons” and “Lost in Space” and movies like “2001: A Space Odyssey”. Disney’s Tomorrowland had long queues of visitors imagining the soon-to-arrive marvels of space travel, robots, jetpacks and artificial intelligence in the home.

While many of the predictions of the era were simply too far-fetched, some would eventually prove close to the mark; all of them, though, were far too ambitious in their timing.

While most of us in 2014 don’t travel by jetpack or have a robot to bring us our slippers, revolutionary change did eventually occur with the development of the global internet. However, the disruption occurred in many iterative phases, with many false starts, even a boom and a bust, and certainly over a much longer period than the early hype predicted.

The predictions of the 60s and 70s that every home would soon have a robot were never realised, but perhaps this is yet to happen. The South Korean Ministry of Information and Communication has predicted that every South Korean household will have a robot between 2015 and 2020, and that by 2018 surgery will routinely be performed by robots. That government is investing heavily in a research and development program to make it happen, but whether the predictions will prove correct, and in the predicted timeframe, remains to be seen.

Disappearing innovation

Now, apply the Jack Dorsey test to the innovative packet-switching technology developed in previous decades. As ubiquitous as TCP/IP has become, and as often as we use and depend on it, it has disappeared from the daily thoughts of those who use it. Even though it forms the fabric behind the connectivity of the digital generation, most of us would rarely give TCP/IP a thought when we send an email, watch a YouTube video or update our Facebook status. TCP/IP protocols helped deliver this page to you, but you probably didn’t think about that when you started to read. Applying the Jack Dorsey test brings us to the conclusion that TCP/IP is one of those ‘best’ technology innovations that has disappeared, just the way Jack described it should.
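As an illustration of just how invisible the protocol has become, the short Python sketch below moves a message over a real TCP connection on the local machine; the handshakes, packets and retransmissions of TCP/IP all happen behind a handful of socket calls:

```python
import socket
import threading

def echo_server(sock: socket.socket) -> None:
    """Accept one connection and echo back whatever it receives."""
    conn, _ = sock.accept()
    with conn:
        conn.sendall(conn.recv(1024))

# Everything TCP/IP does -- connection setup, packetisation,
# acknowledgements -- is hidden beneath these few calls.
server = socket.socket()
server.bind(("127.0.0.1", 0))  # let the OS pick a free port
server.listen(1)
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

client = socket.create_connection(server.getsockname())
client.sendall(b"hello over TCP/IP")
reply = client.recv(1024)
client.close()
server.close()
print(reply.decode())  # hello over TCP/IP
```

Not one line above mentions packets, windows or checksums, which is precisely the disappearance Dorsey describes.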

The cloud hyperbole – is history repeating?

John McCarthy should be heralded as the founder of cloud computing, one of its first believers and pilgrims. In 1961 he predicted cloud computing, presenting the concept of ‘utility computing’ that was one day going to change the world; some fifty years later, those days have begun to unfold.

John McCarthy’s utility computing, today paraded under the banner of cloud computing, is now having a bigger, faster impact than TCP/IP, in that it represents the converged maturity of a much broader spectrum of technologies already used by millions of people. But just like its predecessor TCP/IP, cloud too has been surrounded by much hyperbole and bold predictions.

In the case of cloud computing, however, the innovation has been accompanied by an army of IT vendors with cashed-up marketing mega-machines aiming to stir up a cloud frenzy, all with the objective of getting a bigger slice of the opportunity to sell more tin, software and services. Can we blame the suppliers? No. The cloud hype came at a time when the industry needed a boost after the years of the GFC.

Behind the hype, there is no dispute that the cluster of technological innovations behind cloud computing has already started to change the world, but not everything is moving as fast, or to the scale, that has been predicted. Some may recall the early cloud predictions of 2009 that the majority of businesses would have no in-house IT at all within a few years. And a high-profile CIO was reported to say in 2010 that with cloud his firm would “never need to buy a server, rack or data centre again”. Fast forward to 2014 and that firm continues to buy all of those things. So, while these predictions hold legitimacy in their underlying concepts, in many organisational settings they are not even close to realisation.

The predictions of “wholesale” enterprise adoption are taking much longer to realise than the early hype suggested. Just as we are still waiting to see robots in every home, some of the bold cloud predictions are taking much longer to play out.

In the enterprise space, the complexity of the obstacles in the way of wholesale cloud adoption takes its toll as the years tick by; while the technology continues to mature, the demand side takes much longer to work out the how, why and when of cloud adoption. Why are things slower than predicted? Perhaps because the hype focuses on the maturity of the technology and ignores the maturity of the enterprise environment and its capacity, or even its appetite, for the disruptive business change that wholesale cloud adoption initiates.

In the 1969 eureka moment with computer networking, the computer scientists knew they had unlocked a door that would lead to further technological advancement, but they believed things would progress much more quickly than they eventually did. Similarly today, as the technology of cloud matures and on the surface appears ready for mass adoption, many cloud pundits are scratching their heads wondering why wholesale enterprise adoption remains elusive. Why are enterprises only making cloud moves with the components that sit on the fringe of the enterprise IT organisation: email, collaboration, web sites, open data, test and development environments, point solutions and proofs of concept?

Why is there very little heavy lifting by enterprises into these cloud environments?

Perhaps the hype has always been too focused on the cleverness of the technology, the carriage as it were, without enough focus on the content. The cleverness of the technology alone doesn’t solve the complexity of the consumption environment or the commercial hurdles that need to be overcome. The cleverness of the technology can always be revered, but what is needed now is its translation into tangible business value and meaningful services that can actually be implemented with the least business risk, making a significant commercial difference.

While some technologists understandably place a lot of value in the technology stack and what it can achieve, the business is trying to survive in a harsh marketplace faced with a lot of hard truths, like this one: the market only values execution in the delivery of tangible business value. The substance of value lies in the content delivered and not in the carriage, despite the fact that the carriage makes it happen and a lot easier than was possible before. The gift is always much more valuable than its packaging.

What is needed now is the ability to prise apart the tight grip of the ageing legacy solutions that hold many businesses, legacy that is simply not ready for cloud migration; the ability to unravel the “IT hairball” that has taken 30 years to develop and that business is choking on. While cloud may be the solution, the problem remains what it has always been: how to get from current state to future state.

Even with the cloud promise, the business is still faced with the same challenge: legacy systems have to be remediated, updated or replaced in the transformation process, and this still comes with the same levels of cost, risk and disruption.

Time for cloud to disappear

Just as occurred with TCP/IP, cloud computing will bring much of the predicted change, but over a longer period than predicted. There will come a point when cloud should begin to disappear, becoming part of the underlying fabric that connects the world with its data, customers with information, the business with the delivery of services, and digital consumers with their content.

As it disappears, the focus needs finally to shift away from the technology and onto services, business innovation, and real, tangible business value that makes a difference, such as developing new and better ways of delivering granular, just-in-time information to customers no matter where they are. This is what matters most. In this context, the sooner that “cloud” disappears, the better.

Why Business-as-usual will sink the CIO

The title of CIO has been around for about three decades, and many commentators have taken turns at suggesting what the three letters might stand for, ranging from the positive (Chief Innovation Officer) to the negative (Career Is Over).
If I were to take my turn at renaming the CIO title, I would not come up with something so predictable, but would rather suggest that CIO should stand for “Continuous Improvement Opportunist”.