
The Connection between AI and KM - Part 3 - Cognitive Computing Technology

September 14, 2017

In part one I examined the connection between KM and AI and how this connection has led the way for cognitive computing, while in part two I examined the industries that are or will soon be disrupted by cognitive computing. In this post I will examine the technologies that will lead the disruption cognitive computing brings to many industries.

Cognitive computing is the simulation of human thought processes in a computerized model. Cognitive computing involves self-learning systems (Artificial Neural Network machine learning algorithms) that use data mining, pattern recognition and natural language processing to imitate how humans think. The goal of cognitive computing systems is to accelerate our ability to create, learn, make decisions and think.

According to Forbes, “cognitive computing comes from a mashup of cognitive science and computer science.” However, to understand the various aspects of this mashup, we must peel back the components of cognitive computing, which are centered within AI and KM. These components enable applications to be trained to recognize images, understand speech, and detect patterns, and to acquire knowledge and learn from it, producing more accurate results over time.

Cognitive Technologies

Cognitive technologies have been evolving since I started developing AI applications (expert systems and artificial neural networks) in the late 1980s and early 1990s. Cognitive technologies are now a prominent part of the products being developed within the field of artificial intelligence.

Cognitive computing is not a single technology: It makes use of multiple technologies and algorithms that allow it to infer, predict, understand and make sense of information. These technologies include Artificial Intelligence and Machine Learning algorithms that help train the system to recognize images and understand speech, to recognize patterns, and, through repetition and training, produce ever more accurate results over time. Through Natural Language Processing systems based on semantic technology, cognitive systems can understand meaning and context in a language, allowing a deeper, more intuitive level of discovery and even interaction with information.
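To make the "repetition and training" idea concrete, here is a minimal sketch of my own (illustrative only, not a production system): a single perceptron, the simplest artificial neural network, learning the logical AND pattern by repeatedly passing over its training examples and nudging its weights after each mistake.

```python
import random

# Illustrative only: a single perceptron learning the logical AND pattern.
# Repeated passes over the training examples ("epochs") gradually reduce
# errors, mirroring how training produces more accurate results over time.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

random.seed(42)
weights = [random.uniform(-1, 1), random.uniform(-1, 1)]
bias = random.uniform(-1, 1)
rate = 0.1  # learning rate: size of each corrective nudge

def predict(x):
    total = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1 if total > 0 else 0

for epoch in range(20):
    for x, target in data:
        error = target - predict(x)
        if error != 0:
            # adjust the weights toward the correct answer
            for i in range(2):
                weights[i] += rate * error * x[i]
            bias += rate * error

print([predict(x) for x, _ in data])  # → [0, 0, 0, 1], matching AND
```

Early epochs misclassify some inputs; by the end of training the little network reproduces the AND pattern exactly, having "learned" it purely from repetition and correction.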

The major cognitive technology solutions include:

Expert Systems, Neural Networks, Robotics, Virtual Reality, Big Data Analytics, Deep Learning, Machine Learning Algorithms, Natural Language Processing, and Data Mining

Various cognitive technologies and applications are being developed by many organizations, large and small, including many startups. When it comes to cognitive technologies, IBM Watson has become the most recognized. IBM Watson includes a myriad of components that comprise the Watson ecosystem of products.

Companies Delivering Cognitive Solutions

Here are a few companies delivering cognitive solutions that take advantage of the technologies mentioned above, along with the industries they focus on.

Industry: Healthcare

Welltok: Welltok offers a cognitive powered tool called CaféWell Concierge that can process vast volumes of data instantly to answer individuals’ questions and make intelligent, personalized recommendations. Welltok offers CaféWell Concierge to health insurers, providers, and similar organizations as a way to help their subscribers and patients improve their overall health.

Industry: Finance

Vantage Software: provides reporting and analytics capabilities to private equity firms and small hedge funds. The company’s latest product, Coalesce, is powered by IBM Watson’s cognitive computing technology. This is an example of a company developing a software platform and using IBM Watson’s APIs to provide cognitive capabilities. The product addresses the need to absorb and understand huge volumes of information and use that information to make split-second, reliable decisions about where and when to invest client funds in a highly volatile market.

Industry: Legal

One of the major impediments to quality, affordable legal representation is the high cost of legal research. The body of law is a growing mountain of complex data, and it requires increasingly more hours and manpower to parse. Lawyers are constantly analyzing data to find answers that will benefit their clients. For law firms to stay competitive, they must find ways to cut costs, and streamlining legal research is one way to do just that.

ROSS Intelligence: Built on the Watson cognitive computing platform, ROSS has developed a legal research tool that enables law firms to slash the time spent on research while improving results.

AI & Blockchain

Detailing AI, KM, and cognitive computing would not be complete without adding blockchain to the technologies that will disrupt several industries. Functionally, a blockchain can serve as “an open, distributed ledger that can record transactions between two parties efficiently and in a verifiable and permanent way.” The ledger itself can also be programmed to trigger transactions automatically. AI and blockchain come together when analyzing digital rights; for example, AI can learn the rules by identifying actors who break copyright law. The use of AI applications will be extended by incorporating blockchain technology: when blockchains scale to encompass big data, AI will provide the query and analysis engine to extract insights from the blockchain of data.
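As a rough sketch of that "verifiable and permanent" property (illustrative only; real blockchains add consensus, digital signatures, and peer-to-peer distribution on top of this basic structure), each block can store a hash of the previous block, so any tampering with a recorded transaction becomes detectable:

```python
import hashlib
import json

# Illustrative sketch only -- real blockchains add consensus, signatures,
# and peer-to-peer distribution on top of this basic hash-chain structure.
def make_block(prev_hash, transaction):
    body = {"prev_hash": prev_hash, "transaction": transaction}
    payload = json.dumps(body, sort_keys=True).encode()
    body["hash"] = hashlib.sha256(payload).hexdigest()
    return body

def chain_is_valid(chain):
    for i, block in enumerate(chain):
        body = {"prev_hash": block["prev_hash"], "transaction": block["transaction"]}
        payload = json.dumps(body, sort_keys=True).encode()
        if block["hash"] != hashlib.sha256(payload).hexdigest():
            return False  # the block's contents were altered after recording
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False  # the link to the previous block is broken
    return True

chain = [make_block("0" * 64, "genesis")]
chain.append(make_block(chain[-1]["hash"], "Alice pays Bob 5"))
chain.append(make_block(chain[-1]["hash"], "Bob pays Carol 2"))
print(chain_is_valid(chain))   # True

chain[1]["transaction"] = "Alice pays Bob 500"   # tamper with the ledger
print(chain_is_valid(chain))   # False
```

Because each block's hash covers its contents, and each block records its predecessor's hash, rewriting history requires recomputing every subsequent block, which is exactly the property that makes the ledger verifiable.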

Cognitive technology solutions can be found in applications across many industries, including legal, customer service, oil & gas, healthcare, financial, and automotive. Cognitive technologies have the potential to disrupt every industry and every discipline. Stay tuned!

 

Maximizing and Measuring User Adoption

August 30, 2017

Similar to the old adage, “you can lead a horse to water, but you can’t make him drink,” you can deliver a solution that uses the most cutting-edge technology and beautiful design, but you can’t guarantee that your stakeholders will embrace it. This blog offers practical tips on how to maximize and measure user adoption to ensure that your new tool or process is fully embraced by those for whom you’ve designed it.

To deliver a project success story backed with quantitative and qualitative data to support it, you should take an objective-first approach to change management. This requires a shift in focus from what the change is (e.g. the implementation of a new tool or process) to what you aim to achieve as a result of the change (e.g. increased productivity or improved work satisfaction). Rather than only highlighting the features of the new technology, you’ll want to focus on the benefits the users will gain from using it. Taking this approach is particularly critical for Knowledge Management initiatives, which are initially often met with skepticism and a broad sense of concern that there’s not enough time in the already busy day to acclimate to another new tool or process. By following these guidelines, you’ll be able to say “our users love the new tool and they are so much more effective and efficient as a result of it…” and “here’s the data to prove it.”

The way to accomplish this is by setting “SMART” objectives at the start of your project and developing an analytics strategy that will help you measure your progress towards achieving those objectives. These objectives should clearly express desired changes in user behavior and the impact these new behaviors are expected to have on overall productivity and effectiveness. In the words of Stephen Covey, “start with the end in mind” so that all your efforts are aligned towards achieving your expected results.

Let me put this into context using one of my current projects. I’m working with a global manufacturing organization to design and implement a tool that will help the communications department work in a more centralized and collaborative way. The team is responsible for delivering content about events, programs, and news items to internal employees as well as external stakeholders. The team is used to working in silos and each team member uses different tools for storing, sharing, and finding information such as a basic team site, email, and desktop file folders.

From the very beginning of the project, change management has been a priority. We knew that if we wanted the communications department to adopt the new tool, we had to think of ways to encourage them to do so well in advance of them even having contact with it. Here are ways to apply what my team has done to your change effort to help you maximize and measure user adoption:

Step 1: Align your metrics with desired outcomes

To encourage a more centralized and collaborative way of working for the communications department, we’re using Microsoft O365 tools such as MS Teams, MS Planner, and modern SharePoint team sites as a platform for the new system. We chose this suite of tools because it offers various features that, if used, could save the department a lot of time, reduce wasted effort, and ultimately elevate their role to a more strategic partner within the organization.

Here’s how we’ve expressed our primary objective:

“Increase the team’s efficiency by managing all campaign content, including digital assets, in the new tool within 90 days of launch.”

When content is stored in various places, not everyone has access to the latest versions. This causes a lot of confusion and re-work. The challenge is that people defer to the processes they’re most used to, which is often saving information in their local drives and sharing it via email. The new behavior we wanted to encourage was saving information in a centralized location (in this case a SharePoint team site), so that everyone has access to the latest version, edits are being made to the same copy, and there’s a tracking history of the edits, as well as who made them.

The objectives you identify will vary depending on the challenges you’re trying to solve, so your success metrics should be aligned accordingly. In this case, defining our objective leads us to what we should measure: the percentage of campaign content that is stored and shared in the tool vs. outside of it.
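Using made-up numbers for illustration, that metric reduces to a single calculation: the share of campaign content items stored and shared in the tool versus outside of it.

```python
# Hypothetical counts, for illustration only.
def adoption_rate(in_tool, outside):
    """Percentage of campaign content managed inside the new tool."""
    total = in_tool + outside
    return 100 * in_tool / total if total else 0.0

print(adoption_rate(0, 120))    # baseline: 0.0 (everything in email and local drives)
print(adoption_rate(102, 18))   # 90 days after launch: 85.0
```

Tracking this one number over time tells you, at a glance, whether the team is actually shifting its behavior or quietly relapsing to local drives and email.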

Step 2: Capture baseline metrics and keep it simple

In order to be able to tell a story about the impact of a new tool, you need baseline metrics for comparing your results. For this project, we had three categories of metrics and different approaches for capturing each:

  • Satisfaction Level: We deployed a survey that measured how useful users found their current system.
  • Proficiency Level: We deployed another survey that measured their self-rated proficiency levels with basic SharePoint functionality such as uploading and sharing documents.
  • Usage Level: We tracked activity on the system after launch. This includes number of active users, number of documents and multimedia files saved and shared via the tool, and number of interactions in the conversations space.

The key here is to keep it simple. We designed the surveys to be short and to the point, and only asked specific questions that would help inform the decisions we made on the project. We also didn’t measure everything. We kept it basic to start and the longer the users had to engage with the system, the more sophisticated our metrics became.

Step 3: Take actions that lead to measurable improvements

Our satisfaction survey, along with in-depth user analysis and testing, informed the features we included in our new tool. As we were prioritizing the features, we kept our objectives in mind. It was critical for us to ensure our tool had a separate space for managing content for each campaign. This space had to make it easy for the team to upload, edit, share, and find content, including text-based and multimedia assets.

Our proficiency survey helped us to design the training for the new tool. Had we made the assumption that our users were already familiar with SharePoint’s basic functionality, we would have gone into our training sessions ready to introduce all of its advanced features. Knowing that the team members were not as confident in their SharePoint abilities led us to design a basic SharePoint prerequisite training session for those that needed it. Meeting users at their proficiency level and guiding them towards the level they need to be to make the most of the new tool’s features prevents them from being so discouraged that they abandon the new tool prematurely. (Get more helpful tips on user training by watching Rebecca’s video, Top 5 Tips for Using Training to Promote Adoption).

This is important because we planned to deploy the satisfaction and proficiency survey again once we launched the new tool. Taking actions based on the results of the baseline survey created measurable improvements in how much the users liked the new tool(s) they were using and how confident they were in using it.

Step 4: Measure again once you’ve implemented your solution

This may seem like common sense, but let your users know that the tool is now available for them to use and train them how to use it! Often, the team members heavily involved in the project assume that users know it exists and will intuitively learn how to use it on their own. The team building the tool has spent the past few months or so immersed in the tool, so they are likely to overestimate other people’s awareness of the tool and underestimate the learning curve associated with it.

In our case, our baseline usage level was 0 team members because the tool was brand new. Our goal was to increase usage level to all 30 team members. Our strategy for getting all 30 team members to use the tool, rather than relapsing back to their old habits and systems, was the deployment of “early and often” messages about the tool, along with thorough training for each team member we expected to use it. Long before the tool was launched, we built excitement and awareness around the new tools via a teaser video, Yammer posts, emails, and messages from leadership during team meetings. Once the tool was launched, we conducted live training sessions and delivered helpful resources and guides.

Along the way, we were asking:

  • What percentage of the team watched the teaser video?
  • How many team members saw the Yammer posts? How many “liked,” replied to, or shared them?
  • How many of the team members heard and saw the presentation?
  • Did the team members react positively or negatively to the messages in the video, posts, and presentations?
  • How many of the team members completed the optional pre-work and basic training?
  • How many of the team members attended the live training sessions?

All of these metrics were indicators of the degree to which the users would adopt the new tool. You can then validate these indicators by measuring actual adoption, e.g. user activity within the tool and their satisfaction in using it.
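As a simple illustration (the figures below are hypothetical), each of those questions reduces to a percentage of the 30-person team reached at each touchpoint:

```python
# Hypothetical touchpoint counts for a 30-person team, for illustration only.
team_size = 30
touchpoints = {
    "watched teaser video": 27,
    "saw Yammer posts": 24,
    "completed pre-work": 22,
    "attended live training": 21,
}

# Each percentage is a leading indicator of eventual adoption.
for name, count in touchpoints.items():
    print(f"{name}: {count}/{team_size} ({100 * count / team_size:.0f}%)")
```

A steep drop-off between touchpoints is an early warning: if most of the team saw the teaser but few attended training, you know where to intervene before measuring actual usage.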

Step 5: Give it some time, then measure again

As we were building the tool, the project team discussed how we were going to tell our success story. But, that really depended on how we defined our success. For us, did success mean that we launched the new tool on schedule and under budget? Or, did it mean that the communications team members were embracing the new tool and way of working? The latter for us was much more important so we developed a timeline for capturing feedback: one week after launch, one month after launch, 3 months after launch, and 6 months after launch. During these set time periods, we would capture metrics around how satisfied they are with the new tool and its impact on their work and how proficient they felt with their new skill sets. In addition to self-reported data, we would also track usage metrics such as what percentage of the team actively manages their campaign within the tool vs. outside of it.

Summary

Organizations invest large amounts of money in new technology with the intention of improving employee productivity. The key to getting a significant return on these investments is to make sure your project team has what it takes to define, drive, and measure success. If you want to make sure the next solution you roll out maximizes user adoption and produces measurable results, contact Enterprise Knowledge at info@enterprise-knowledge.com.

The Connection between Artificial Intelligence and Knowledge Management - Part 2

July 31, 2017

The Disruption of Cognitive Computing

This is the second of a three-part post on the connection between Artificial Intelligence (AI) and Knowledge Management (KM). In this post I examine those industries that will or are soon to be disrupted by AI and KM, specifically in the form of Cognitive Computing.

Before we look ahead, let’s take a look back. When I first became involved in AI (the late 80s), its hype and promise became too much to live up to (a typical phenomenon in software - see Hype Cycle) and it faded into the background. Fast forward to 2010, and AI was beginning to become the “next big thing.” AI had already made its presence felt in the automobile industry (robotics), as well as with decision-making systems in medicine, logistics, and manufacturing (expert systems and neural networks). Now AI in the form of cognitive computing is making its mark on several industries. A recent CB Insights newsletter cited US Bureau of Labor Statistics figures indicating that 10.5 million jobs are at risk of automation. The rapid adoption of better hardware processing capabilities, which lets artificial intelligence algorithms run on big data, is driving this change across both blue- and white-collar jobs.

At a recent Harvard University commencement address, Facebook Chief Executive Mark Zuckerberg stated “Our generation will have to deal with tens of millions of jobs replaced by automation like self-driving cars and trucks."

Bill Gates, the founder of Microsoft and Chairman of the Bill and Melinda Gates Foundation in a recent MarketWatch story had this to say “In that movie, old Benjamin Braddock (Dustin Hoffman) was given this very famous piece of advice: 'I just want to say one word to you. Just one word …Plastics.'  And today? That word would likely be 'robots,' and 'artificial intelligence' would have a huge impact."

Although there are many industries where Cognitive Computing will disrupt the way business is conducted including the economics around job loss and future job creation, I have chosen to look at three industries; Legal Services, the Automotive Industry, and Healthcare.

Legal Services

Knowledge Management (KM) is becoming more prevalent within law firms and legal departments as the practice of KM matures. AI technologies are also making their way into the practice of law. The ability to reuse internally developed knowledge assets such as precedents, letters, research findings, and case history information is vital to a law firm’s success. Paralegals currently play a critical role in assisting attorneys with discovery. With the use of AI systems, attorneys will be able to “mine” the large volumes of documents (i.e., precedents, research findings, and case history information) located in various repositories more accurately and efficiently, aiding decision making and successful client outcomes. This ability will limit the number of paralegals and attorneys needed to perform these tasks.

Cognitive computing will enable computers to learn how to complete tasks traditionally done by humans. The focus of cognitive computing is to look for patterns in data, carrying out tests to evaluate the data and find results. This will provide lawyers with capabilities similar to those it provides doctors: an in-depth look into the data that yields insights that cannot be obtained otherwise. According to the 2015 Altman Weil Law Firms in Transition survey, 35% of law firm leaders indicate cognitive computing will replace first-year associates in the next ten years, while 20% indicate it will replace second- and third-year attorneys as well. In addition, 50% of law firm leaders indicate cognitive computing will replace paralegals altogether. Cognitive computing’s capability to mine big data is the essential reason lower-level research jobs will be replaced by computers. This situation is not limited to the legal profession.

Automotive Industry

Autonomous Vehicles and Vehicle Insurance

Autonomous vehicles, also known as driverless cars, robot cars (here we go with robots again!), or self-driving cars, can guide themselves without human intervention. This kind of vehicle is paving the way for future cognitive systems in which computers take over the art of driving. Autonomous vehicles are positioned to disrupt the insurance industry. Let’s take a look at the coverages that make up a typical vehicle insurance policy.

Vehicle insurance typically addresses six coverages. These coverages include:

  • Bodily Injury Liability, which typically applies to injuries that you, the designated driver or policyholder, cause to someone else;
  • Medical Payments or Personal Injury Protection (PIP), which covers the treatment of injuries to the driver and passengers of the policyholder's vehicle;
  • Property Damage Liability, which covers damage you (or someone driving the car with your permission) may cause to someone else's property;
  • Collision, which covers damage to your car resulting from a collision with another car, an object, or even a pothole;
  • Comprehensive, which covers you for loss due to theft or damage caused by something other than a collision with another car or object, such as fire, falling objects, etc.;

  • Uninsured and Underinsured Motorist Coverage, which reimburses you, a member of your family, or a designated driver if one of you is hit by an uninsured or hit-and-run driver.

The way these coverages are applied (or not) to a vehicle policy will be disrupted by the use of autonomous vehicles.

According to a 2016 Forbes article by Jeff McMahon, about 90 percent of car accidents are caused by human error, and it is estimated that autonomous vehicles will significantly reduce the number of accidents. This will significantly disrupt the insurance revenue model, affecting all six types of coverage identified above. When the risk of accidents drops, the demand for insurance will potentially drop as well (though not unless states no longer require accident coverage). There is little doubt that auto insurance companies will change the types of coverage and the language of their policies.

Some Unintended Side Effects?

The autonomous vehicle, with its multiple sensors, has the potential to eliminate accidents due to distractions and drunk driving. By largely eliminating crashes, this will disrupt the vehicle repair industry: collision repair shops will lose a huge portion of their business. Indirectly, the decreased demand for new auto parts will hurt vehicle parts manufacturers. According to the U.S. Department of Transportation, in 2010 approximately 24 million vehicles were damaged in accidents, at an economic cost of $76 billion in property damage. The loss of this revenue will put a strain on these manufacturers.

(to be continued)

The Connection between Artificial Intelligence and Knowledge Management

July 18, 2017

This is the first of a three (3) part post on the connection between Artificial Intelligence and Knowledge Management.

Artificial Intelligence (AI) has become the latest “buzzword” in the industry today. However, AI has been around for decades. The intent of AI is to enable computers to perform tasks that normally require human intelligence, and as such AI will evolve to take over many jobs once performed by humans. I studied and developed applications in AI from the mid-1980s through the early 2000s. In the late 1980s and early 1990s, AI evolved into a multidisciplinary science that included expert systems, neural networks, robotics, Natural Language Processing (NLP), speech recognition, and virtual reality.

Knowledge Management (KM) is also a multidisciplinary field, encompassing psychology, epistemology, and cognitive science. The goals of KM are to enable people and organizations to collaborate, share, create, use, and reuse knowledge. With this understanding, KM is leveraged to improve performance, increase innovation, and expand what we know, both from an individual and an organizational perspective.

KM and AI, at their core, are about knowledge. AI provides the mechanisms to enable machines to learn; it allows machines to acquire, process, and use knowledge to perform tasks, and to unlock knowledge that can be delivered to humans to improve the decision-making process. I believe that AI and KM are two sides of the same coin. KM allows an understanding of knowledge to occur, while AI provides the capabilities to expand, use, and create knowledge in ways we have not yet imagined.

The connection between KM and AI has led the way for cognitive computing. Cognitive computing uses computerized models to simulate human thought processes. It involves self-learning, deep artificial neural network software that uses text and data mining, pattern recognition, and natural language processing to mimic the way the human brain works. Cognitive computing is leading the way for future applications involving AI and KM.

In recent years, the ability to mine larger amounts of data, information, and knowledge to gain competitive advantage, and the importance of data and text analytics to this effort, have gained momentum. As the proliferation of structured and unstructured data continues, we will continue to need to uncover the knowledge contained within these big data resources. Cognitive computing will be key in extracting knowledge from big data. Research into strategy, process-centric approaches, and the interorganizational aspects of decision support, along with new technology and academic endeavors in this space, will continue to provide insights on how we process big data to enhance decision making.

Cognitive computing is the next evolution of the connection between AI and KM. In future posts, I will examine and discuss the industries where cognitive computing is becoming a disruptive force. This disruption will lead to dramatic changes in how people work in these industries.

KM: Lessons Learned from Pioneer 10

July 10, 2017

Given the 45th anniversary of its launch this year, what can we learn about Knowledge Management from Pioneer 10?

Over 9 billion miles away from our sun, a solitary spacecraft continues its long journey into interstellar space. Scientific investigations of great value were sent back to Earth until March 31, 1997, the official end of its main mission. Exhausted of all energy, it drifts along the solar winds of space, its last signal having been detected over 13 years ago. Some say that this is when its secondary mission truly began: to act as an ambassador to cosmic civilizations. To accomplish this, a gold-anodized aluminum plaque was attached to Pioneer 10. Carl Sagan and Frank Drake designed the message, with artwork prepared by Linda Salzman Sagan.

 

This is the message:

Hydrogen, considered the most abundant element in the universe, was chosen because it is a universal phenomenon, and the hyperfine transition depicted could be used as a base for measuring time and distance, thereby decoding the remainder of the plaque. The array of lines and dashes to the center-left of the plaque represents pulsars, which can be used to calculate the position of the sun relative to the center of the Milky Way galaxy. They can also be used to calculate when Pioneer 10 was launched. At the bottom of the plaque, the solar system is depicted, along with the plaque’s home planet and Pioneer 10’s trajectory out of our solar system.

Lastly, the plaque shows a drawing of a man and a woman against a silhouette of Pioneer 10; their height can be calculated using the hyperfine transition key.
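For the curious, the hyperfine transition key is easy to check with a few lines of arithmetic: the transition corresponds to radio emission at roughly 1420.4 MHz, and its wavelength and period supply the plaque's universal units of length and time.

```python
# The hydrogen hyperfine transition used as the plaque's measurement key.
c = 299_792_458            # speed of light, in m/s
f = 1_420_405_751.77       # hyperfine transition frequency, in Hz

wavelength = c / f         # the famous "21 cm line", the plaque's unit of length
period = 1 / f             # about 0.7 nanoseconds, the plaque's unit of time

print(f"wavelength: {wavelength * 100:.1f} cm")   # wavelength: 21.1 cm
```

Any civilization that recognizes hydrogen can rederive these two quantities, which is what lets the binary markings elsewhere on the plaque encode sizes and times without any human units.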

First, let us go over what is “wrong” in the message: 

  • The frequency of one of the pulsars is calculated incorrectly. That error was made by Sagan and Drake because the pulsar could not be calculated as precisely in the 70s as it can be now.
  • The solar system is depicted incorrectly. We now consider our solar system to have 8 planets and 5 dwarf planets. Additionally, the arrow showing Earth and Pioneer 10’s trajectory may not be understood by aliens who did not “grow up” in a hunter-gatherer society. Lastly, Saturn has a dash through it, which was meant to signify its rings. A dash, given the plaque’s binary notation, could be misinterpreted to mean the planet no longer exists or that some calamity occurred.
  • The male and female portrayed are the most debated on the plaque. The major issue back in the 70s was that both were depicted without clothing. The drawings of both were meant to depict all races but many viewed them as Caucasian. The female was also perceived as being subservient to the male.
  • Lastly, the hand raised was considered a sign of greeting but aliens could consider this to mean “stop” or anything else for that matter.  

The “right” in the message:

  • Despite the change in our understanding of what we consider a planet, 8 of the largest bodies are represented. A small version of Pioneer 10 was depicted travelling from Earth outwards so it truly was the best representation at the time for its origin (and still is).   
  • The hand raised by the male had an additional use: to show the opposable thumb, which many consider one of the largest leaps in our evolution. If the female had raised her hand too, it could have implied that we all walked around that way; for the same reason, her stance is slightly different from her counterpart’s.
  • It was simple and concise and could be interpreted by cultures at both ends of the technical spectrum. 

What does this have to do with Knowledge Management? 

  • When writing any type of Knowledge Base Article, keep things as simple as possible as even the simplest pieces could be interpreted differently. Wording should be concise and should “stick to the facts.” Do not create an article for the sake of creating it – the article must serve a purpose. 
  • Maintenance of Knowledge. Take steps to ensure that whatever is created can be maintained by having a process to do so. A Knowledge Base Article is a living thing.
  • Your Knowledge Base Article will be there even if you are not. Just because one person created it does not mean that person “owns” it. Knowledge is for the betterment of all, not for a select few.
  • Your Knowledge Base Article is an Ambassador for your company. Use spelling and grammar check and most importantly be professional in your writing.
  • Technology. While it is not the main force behind Knowledge Management, we cannot discount its power, so invest in the tools proven to increase successes. When writing an article, think about how a quick video, diagram, and other multimedia may help to supplement the document. The Voyager probes contained images and recordings while the New Horizons probe went all digital.