Cognition-as-a-Service: Rent Learning Machines Inside the Cloud

Summary: Siri and IBM Watson for everyone? Organizations might soon be able to rent brain power by the hour from the cloud, creating a new marketplace for contextually aware and adaptive applications powered by software programs that learn from real-world human user interactions. Our educational and workflow experiences might soon tap cognitive computing applications to help us learn more effectively and make better decisions on complex issues.

Companies are starting to experiment with business models to deliver cloud-based cognition in the form of ‘contextual and cognitive computing’ via API engines and stand-alone software solutions.

Contextual & Cognitive Computing
Cognition as a Service is just a buzzword today; it is far from defined! The most likely path towards bringing cognition-level abilities to organizations will pass through stages: first, an evolution towards contextual experiences, followed by more sophisticated ‘IBM Watson’-like applications.

Contextual web experiences move us from information delivered by ‘keyword’ connections to a more personalized sense of the right ‘context’ for our lives based on location, activity, life experiences, preferences, and more.  Contextual experiences are more personalized and certainly ‘learn’ from interactions with users, but there is a new paradigm of ‘cognitive’ systems that elevates the experience.
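To make the keyword-versus-context shift concrete, here is a minimal sketch of how a contextual ranker might blend keyword matching with signals like location and activity. The data model, signals, and weights are hypothetical assumptions for illustration, not any vendor’s actual logic.

```python
# Minimal sketch, assuming hypothetical signals and weights, of context-aware ranking.
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    keywords: set       # terms describing the item
    place: str          # where the item is most relevant
    activity: str       # e.g. "commuting", "studying", "exploring"

def keyword_score(item: Item, query_terms: set) -> float:
    """Classic 'keyword connection': count overlapping terms only."""
    return float(len(item.keywords & query_terms))

def contextual_score(item: Item, query_terms: set, user_place: str, user_activity: str) -> float:
    """Blend keyword overlap with personal context signals (illustrative weights)."""
    score = keyword_score(item, query_terms)
    score += 2.0 if item.place == user_place else 0.0        # boost locally relevant results
    score += 1.0 if item.activity == user_activity else 0.0  # boost results matching current activity
    return score

items = [
    Item("Museum late-night hours", {"museum", "hours"}, "downtown", "exploring"),
    Item("Museum gift shop catalog", {"museum", "shop"}, "online", "browsing"),
]
query = {"museum"}
# A pure keyword engine ties these two results; the contextual ranker prefers the local, in-the-moment one.
best = max(items, key=lambda i: contextual_score(i, query, "downtown", "exploring"))
print(best.title)
```

A system that also updated those weights from user interactions would start to edge towards the ‘learning’ behavior of the cognitive systems described next.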

Cognitive Computing is an era in which software systems learn on their own and can teach themselves how to improve their performance.  IBM Watson is today’s most sophisticated cognitive application.  Watson currently provides decision support for cancer treatments at Sloan Kettering, financial services support for companies such as ANZ and Citigroup, and is working to evolve the retail customer experience for The North Face.

Cognition as a Service?
In practical terms, this means an organization might be able to buy, ‘as a service’ (i.e. not an application they paid to develop or maintain), natural language processing for human-like question-and-answer interactions.
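As a sketch of what consuming such a rented capability might look like, the snippet below posts a question to a hypothetical cloud Q&A endpoint. The URL, authentication scheme, and JSON fields are assumptions standing in for whatever a real provider would expose.

```python
# Minimal sketch of calling a hypothetical 'cognition as a service' Q&A endpoint.
# The endpoint, API key, and response fields are illustrative only.
import requests

API_KEY = "your-api-key"  # issued by the (hypothetical) service provider
ENDPOINT = "https://api.example-cognition.com/v1/answer"

def ask(question: str, context: str) -> str:
    """Send a natural-language question plus business context; return the answer text."""
    response = requests.post(
        ENDPOINT,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"question": question, "context": context},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["answer"]

if __name__ == "__main__":
    print(ask(
        "Which suppliers are most exposed to this week's port delays?",
        "Mid-size retailer, imports 60% of inventory through two coastal ports.",
    ))
```

The point of the business model is in what is missing here: no model training, no servers, and no data science team on the buyer’s side, just metered calls to someone else’s learning system.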

Companies pushing both of these capabilities are simultaneously trying to improve performance, integration, and business model design.  Early industry adopters will range across health, finance, energy, education, and beyond: industries with connected data and a need to augment human knowledge building.  These early-adopter industries will likely have lots of data and compliance-heavy regulatory frameworks.

 

The cloud-based business models of ‘software-as-a-service’ and ‘platform/infrastructure-as-a-service’ are the most likely first path for contextual and cognitive platforms.  Startups such as Expect Labs are bringing their API engine, MindMeld, into the marketplace.  IBM is hedging its bets with Watson by delivering stand-alone solutions and also opening up its API to developers.

Developers are just now getting their hands dirty with advanced contextual computing applications. Truly transformational applications will likely emerge between 2015 and 2025.

Companies to Watch:
AlchemyAPI, Declara, Intelligent Artifacts, Grok, Saffron, Stremor Plexi, and Vicarious are companies with products designed to build knowledge graphs and natural language interactions with contextual and cognitive computing-style capabilities.

Video to Watch – Range of Business Models? 

Sean Gourley, founder of Quid, has been wrapping his head around the future of mankind working ‘with’ machines for several years.  I’ve seen Sean’s talk evolve over two years, and it continues to be among the most solid framings of this massive transition towards ‘augmented intelligence.’

This is a great talk by Sean Gourley from the March 2014 Gigaom Structure Data conference.

***

I’ve posted Expect Labs MindMeld videos here.

***

Looking for something more mainstream and business-oriented?

IBM Watson Playlist here


Garry’s Diigo Tags on: IBM Watson

 

It is not a ‘Driverless Car’, It is the Beginning of Captain Culture

Summary: The ‘driverless car’ headlines are misleading!  Human ‘drivers’ will evolve into ‘Captains’ and still play a critical role in the age of connected cars and autonomous vehicles.  Similar to airline pilots who captain largely automated planes, humans will soon contribute less to decisions on acceleration, braking, and steering, and more thinking and control to the higher-order operations of smarter, connected vehicles.  The testing ground for Captain-like experiences will be the near-term transition to connected cars with ‘active assist’ vehicle systems like adaptive cruise control and platooning.   >> One scenario for this Captain-style command and control culture is that it becomes regulated and led by insurance companies to ensure humans learn how to use advanced systems safely.  The driver’s license might evolve into a Captain’s license 😉

 

Background
Self-driving cars suitable for real-world operation are closer to reality than most people might believe. Bold claims of bringing autonomous vehicles to markets by 2020 have been made by Nissan and Daimler. Across the world, transportation agencies are outlining the roadmaps and regulatory frameworks needed to support testing and commercialization.  Insurance companies are figuring out risk guidelines to deal with liabilities and inevitable incidents within this new autonomous age.

>>>
Who Flew the Plane? The Captain or the Computer?
Think about the last time you were on an airplane.  You boarded, buckled your belt, paid very little attention to the safety instructions, then put your faith in the human Captain to get you to the destination safely and on time! Even though the plane was largely controlled by automated systems, you felt the human Captain was in charge.

Who will ‘Drive’ the Connected Cars and Autonomous Cars of the Future?
We are not entering the age of ‘driverless cars’; we are transitioning to the era of ‘Captain’ culture, where human thinking and vehicle operation move up the value chain into new forms of command and control.

In the years ahead we will gradually share and cede ‘control’ of acceleration and braking, and gain a sense of responsibility over higher-level thinking around active assist systems.

The age of Active Assist is defined by vehicle systems that can sense problems and alert the driver, or sense and control the vehicle to avoid an incident. Active Assist includes features such as adaptive cruise control, lane-departure warning, collision warning, and collision avoidance systems. Many of these features are already available in luxury vehicles but are not yet mainstream or part of our popular culture.
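As a rough illustration of the sense-alert-assist logic behind these features, here is a minimal sketch of one adaptive-cruise-style control step. The sensor inputs, gains, and thresholds are simplified assumptions, nothing like a production system fusing radar, camera, and LIDAR data.

```python
# Hypothetical, highly simplified sketch of an active-assist control cycle.
# Gains and thresholds below are illustrative only.

SAFE_GAP_SECONDS = 2.0   # desired time gap to the vehicle ahead
WARN_GAP_SECONDS = 1.0   # below this, alert the driver
GAP_GAIN = 0.5           # proportional gain on the time-gap error
CLOSING_GAIN = 0.3       # damping on closing speed

def active_assist_step(own_speed_mps: float, gap_m: float, lead_speed_mps: float):
    """Return (acceleration_command_mps2, driver_alert) for one control cycle."""
    time_gap = gap_m / max(own_speed_mps, 0.1)            # seconds until reaching the lead vehicle
    alert = time_gap < WARN_GAP_SECONDS                   # tailgating / collision warning
    gap_error = time_gap - SAFE_GAP_SECONDS               # positive -> too far back, speed up
    closing_speed = own_speed_mps - lead_speed_mps        # positive -> we are gaining on the lead
    accel = GAP_GAIN * gap_error - CLOSING_GAIN * closing_speed
    return max(min(accel, 1.5), -4.0), alert              # clamp to comfortable limits

if __name__ == "__main__":
    accel, alert = active_assist_step(own_speed_mps=30.0, gap_m=25.0, lead_speed_mps=27.0)
    print(f"accel command: {accel:.2f} m/s^2, driver alert: {alert}")
```

Even this toy loop shows the division of labor: the software handles the moment-to-moment gap keeping, while the human is alerted and expected to stay in command.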

I love the idea of autonomous vehicles but don’t expect people to just fall in love with self-driving cars overnight. It is the era of active assist where our relationship with vehicles will evolve.

Captains will look at dashboards of information and recommendation systems.  They will be tuned into geospatial (map-based) information on infrastructure and other connected cars.  Captains will monitor mission-critical systems, from LIDAR to smart tire sensors, and watch for notifications of changing conditions (e.g. accidents, road debris) coming from connected, sensing vehicles across the road network. Being Captain of a smarter vehicle might feel more human than being a ‘driver’ of the cars we know today.

These are possible higher-order thinking activities associated with Captain culture, but the scenario is not inevitable.

Captain Era: Risks, Rewards & Responsibilities
There are considerable risks, including over-reliance on computing systems, loss of skills, software failures, and malware attacks.  The assumption of the Captain role is that humans maintain awareness and attention so they can respond to system failures and resume control.

The rewards are improved safety and flow. We can imagine a dramatic reduction in deaths and injuries as human error is dampened by active assist and autonomous vehicles.  We can imagine commuting within major metropolitan regions moving more smoothly, with ‘flow’ being the most desirable condition.  We might not go as fast as we want, but flow means we will not see stop-and-go traffic patterns.

The responsibilities require us to be more transparent and accountable.  Folks who fear big brother will not be happy in this future.

Critique: This Captain Culture is Nonsense
There is another scenario (or vision) of the age of autonomous vehicles where humans do not ‘have’ to pay attention, or will not ‘want’ to pay attention.  The assumption might be that humans are lazy or uninterested.  Why would someone want to oversee advanced active assist systems?

This critique is perfectly reasonable, but I think the answer to the question is: ‘because they will have to pay attention’.

I would expect transportation regulators, over the next twenty years, to force humans to stay engaged and attentive.  In-cabin sensing systems will know if people within autonomous or active assist vehicles are not paying attention.
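As a sketch of how such an attention requirement could be enforced in software, the loop below watches a hypothetical ‘eyes on road’ signal and escalates its response the longer the human looks away. The signal, thresholds, and actions are assumptions for illustration.

```python
# Minimal sketch, assuming a hypothetical eyes-on-road signal, of an in-cabin attention monitor.
WARN_AFTER_S = 2.0      # soft chime after 2 seconds of inattention
ESCALATE_AFTER_S = 5.0  # stronger intervention after 5 seconds

def monitor(samples):
    """Consume (timestamp_s, eyes_on_road) samples and yield escalating actions."""
    inattentive_since = None
    for timestamp, eyes_on_road in samples:
        if eyes_on_road:
            inattentive_since = None            # attention restored, reset the clock
            continue
        if inattentive_since is None:
            inattentive_since = timestamp       # start timing the lapse
        elapsed = timestamp - inattentive_since
        if elapsed >= ESCALATE_AFTER_S:
            yield ("escalate", timestamp)       # e.g. slow the vehicle, demand a takeover
        elif elapsed >= WARN_AFTER_S:
            yield ("chime", timestamp)          # gentle audible warning

if __name__ == "__main__":
    # Driver looks away between t=1.0s and t=6.0s in this synthetic trace.
    trace = [(t * 0.5, not (1.0 <= t * 0.5 <= 6.0)) for t in range(16)]
    for action in monitor(trace):
        print(action)
```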

If I had to place a bet, it would be that the Captain culture era is regulated into social norms.  There is simply too much risk to confront given the current maturity of both active assist and autonomous vehicles.

 

Videos on Autonomous Vehicles & Active Assist

http://www.youtube.com/watch?v=LHqB47F12vI

 

Bosch – Automated Driving

 

Where to watch now: high-performance auto racing

 

Milestones to Watch – Imagine

  • DMV Grants first Captain License

 

Image Use: CC Steve Jurvetson

 

Background on Autonomous Vehicles 

Driverless Cars (=We don’t trust humans!) vs Active Assist (=We love people!)
The most empowering headlines would be those that frame the transition in pro-human terms and talk more about empowering humans than making them seem irrelevant.  We can find the human factors within vehicles built around active assist.

How will the role of people evolve in this near-term transition towards active assist?  What is the ideal image of human operators in this new era of software-assisted driving?  What would be the worst-case outcome?

Garry delivers keynotes, workshops and consultation for organizations around the world! Let’s talk about how he can help yours.