IT in Insurance - The Internet of Things: Observations and predictions

Source: Asia Insurance Review | Aug 2016

Previously unsolvable core business problems can now be solved, and the cost of data transmission continues to fall faster than the cost of storage. These are some of the observations made by Mr Daniel Angelucci of CSC, who shares what the Internet of Things (IoT) means for the insurance industry.
The Internet of Things. No phrase gets both the IT and Insurance industries salivating quite like this one. The excitement is palpable; the hype cycle is in full swing; and everyone is writing articles about what it means. I want to make three observations and three predictions regarding IoT with respect to the Insurance industry. First, some observations:
Core business problems are being solved through IoT
Interestingly, IoT helps to “solve” problems which have a specific mathematical definition. For these “NP” problems, no efficient algorithm is known for finding a solution, but a proposed solution can be verified quickly once found.
   One example of this type of problem sits at the core of public key cryptography. There is no known efficient algorithm to derive a private key from a public key. Once a candidate key is proposed, however, it is easily verified: if it correctly decrypts data encrypted with the public key, it is obviously the right key. For hackers to find the private key, they must make many, many attempts and check whether each attempt solves the problem. In other words, they must work through massive amounts of data in order to determine the solution.
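As an illustration, a toy hash-preimage search shows the same asymmetry (the hash function and the numbers here are stand-ins for illustration, not the actual key-recovery problem): verifying a candidate is a single cheap operation, while finding one requires trying many candidates.

```python
import hashlib

def verify(candidate: bytes, digest: str) -> bool:
    """Verification is cheap: one hash computation."""
    return hashlib.sha256(candidate).hexdigest() == digest

def brute_force(digest: str, max_value: int):
    """Search has no known shortcut: try candidates one by one."""
    for i in range(max_value):
        candidate = i.to_bytes(4, "big")
        if verify(candidate, digest):
            return candidate
    return None

secret = (12345).to_bytes(4, "big")
digest = hashlib.sha256(secret).hexdigest()

assert verify(secret, digest)                  # instant check
assert brute_force(digest, 100_000) == secret  # thousands of attempts
```

The same shape applies to the operational problems described above: checking whether a proposed answer fits the accumulated data is easy, which is why collecting massive amounts of data makes the search tractable.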
   What does this mean for the insurance industry? There are a series of operational problems which are true “NP” problems. There is no known algorithm that takes into account the layers of complexity associated with these problems, but by collecting massive amounts of data, these can be solved. What was once unsolvable has become solvable. 
Cost of data transmission continues to fall
The cost in this case represents the actual cost plus the risk premium. 
   Transmission of data has historically been risky, whether represented by messengers running dispatches between armies or sending messages across less than reliable wires. It has nearly always been cheaper to store data. This, in my opinion, has recently changed. 
   The reliability of modern data networks and the significant drop in costs means that transmission has become cheaper and less risky. This explains the rise of cloud services as well as the rise of IoT. As the cost of transmission continues to decline faster than storage, we should expect to see real-time event management (the collection of telemetry for real-time insights) move from the “core”, where the data is stored, to the “edge”, where we can act quickly and definitively to optimise business outcomes.
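A minimal sketch of this edge model, with invented sensor readings and an invented threshold: raw telemetry stays local, events are acted on immediately, and only a compact summary travels to the core for storage.

```python
from statistics import mean

EVENT_THRESHOLD = 3.5  # illustrative g-force level for a "hard braking" event

def process_at_edge(readings, threshold=EVENT_THRESHOLD):
    """Act on events locally and transmit only a compact summary,
    instead of shipping every raw reading back to the core."""
    events = [r for r in readings if r >= threshold]
    summary = {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "peak": max(readings),
    }
    return events, summary

raw = [0.2, 0.4, 4.1, 0.3, 3.8, 0.2]
events, summary = process_at_edge(raw)
# Two events cross the wire for immediate action; the rest is summarised.
```

The trade the sketch makes explicit: six readings become two urgent events plus three summary numbers, which is why cheap transmission plus edge processing beats hauling everything to the data centre.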
Central data hub for enterprise services
Digital organisations, the future shape of insurance organisations, have an extreme bias toward data-centricity. The IT department is becoming the custodian of the data which drives changes in enterprise services. Moving the bits is as important as any other function within the organisation.
   Fortunately, the methodology for managing enterprise services has largely been developed within IT organisations. We in the technology field are used to collecting telemetry and then gleaning insights, finding and reacting to events. We have also learned to make changes proactively based on trend data. We need to learn to apply these Service Management principles across the business landscape.
   Every business service has business events which occur in real time and business trends which are discernible through analysis. Leveraging this IT capability is the key for enterprises to move to digital insurance, increasing both customer satisfaction and profitability.
   Based on these observations, what conclusions can we draw about the future world of IoT with respect to insurance? I will make three predictions about the future and what they mean for insurance.
Cost of sensors continues to fall relative to storage
What does this mean for the future? 
   The current model for data transmission is to move data from the edge of the network (where the sensors live) to the data centre where the data is then analysed and correlated. The rise of cheaper sensors, which talk to each other as well as to a centralised data store, creates an opportunity for truly autonomous event management at the edge of the network. 
   As with the body’s nervous system, not every action requires a “brain” to process information before it is taken. Instead, our smarter sensors will be able to act like a reflex: known event patterns can be recognised and immediate action taken.
   We are moving from the era of on-disk analytics, to in-memory analytics, and now to “on-network” analytics. For insurance, being able to provide real-time pricing based on user activities without reliance upon centralised data processing means additional efficiencies. Further, as more and more insurance customers are outfitted with FitBits and the like, the ability to give real-time feedback introduces the possibility of intervening to truly better the lives of customers.
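A hypothetical sketch of such a reflex-style pricing rule (the base premium, activity bands, and discount rates are all invented, not an actual pricing model): an activity reading maps straight to an adjusted premium, with no round trip to a central data store.

```python
BASE_PREMIUM = 100.0  # illustrative monthly premium

def price_adjustment(daily_steps: int) -> float:
    """Map an activity reading to a discount rate.
    Simple enough to run on the device itself."""
    if daily_steps >= 10_000:
        return -0.10  # 10% discount -- invented band
    if daily_steps >= 5_000:
        return -0.05
    return 0.0

def realtime_premium(daily_steps: int) -> float:
    """Price computed at the edge, instantly, from the latest reading."""
    return round(BASE_PREMIUM * (1 + price_adjustment(daily_steps)), 2)

print(realtime_premium(12_000))  # → 90.0
```

Because the rule is a pure function of the reading, the feedback to the customer can be immediate, which is exactly the intervention opportunity described above.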
   Interestingly, how we utilise trend data will change. The storage of data is likely to become the exclusive realm of risk-based decision support. The goal of the analysis for stored data will not be to identify actionable insights, but rather to solve the previously unsolvable issues associated with risk. We know that certain actions increase risk to customers and to insurers, but can we quantify those precisely enough to increase the effectiveness of our risk management processes? I believe we can, and this will be the exclusive realm for analytics processes that utilise stored data.
Moving to an “edge-to-edge” model
As we begin to build more autonomous networks, security and non-repudiability become critical. How do we ensure that the devices which compose our network are the devices we control and manage?
   Our current approaches involve using certificates or other validation means. These approaches rely on a data transmission model which runs from edge to core. For example, a central certification authority acts to validate the data transmission through the use of a certificate trust. That central authority resides in a data centre. New models of network intelligence do not rely upon the data centre, and since we expect actions in real-time, they simply cannot rely upon this. Instead we need to develop security models which work “in the field” near the sensors.
   This has enormous consequences for the insurance industry. The model for trust relationships that these sensors require is one that very easily translates into a customer-to-agent interaction, or even a customer-to-customer interaction. 
Think about how we determine “trust” in our everyday lives. If an agent asks me to sign a policy document, I may have my wife or another trusted third party witness my signature. The fact that we trust each other makes the signature valid.
   As we move into the world of “security at the edge”, we need to deploy trust models that look more like this. This points to some of the “shared ledger” techniques found in technologies like Blockchain. The ability to break our dependence on centralised authority for non-repudiability will be critical in securing and relying upon edge networks.
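A minimal sketch of the shared-ledger idea (deliberately simplified; real blockchains add consensus protocols and digital signatures): each record commits to the hash of the record before it, so any party holding a copy can detect tampering without appealing to a central authority.

```python
import hashlib
import json

def entry_hash(payload: dict, prev: str) -> str:
    """Deterministic hash over a record and its predecessor's hash."""
    body = json.dumps({"payload": payload, "prev": prev}, sort_keys=True)
    return hashlib.sha256(body.encode()).hexdigest()

def append_entry(ledger: list, payload: dict) -> None:
    """Each record commits to the previous hash, chaining the ledger."""
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    ledger.append({"payload": payload, "prev": prev,
                   "hash": entry_hash(payload, prev)})

def verify_ledger(ledger: list) -> bool:
    """Any copy-holder can replay the chain and spot tampering."""
    prev = "0" * 64
    for entry in ledger:
        if entry["prev"] != prev:
            return False
        if entry["hash"] != entry_hash(entry["payload"], entry["prev"]):
            return False
        prev = entry["hash"]
    return True

ledger = []
append_entry(ledger, {"event": "policy_signed", "by": "customer_a"})
append_entry(ledger, {"event": "witnessed", "by": "customer_b"})
assert verify_ledger(ledger)

ledger[0]["payload"]["by"] = "someone_else"  # tampering...
assert not verify_ledger(ledger)             # ...is detected by anyone
```

The witness example above maps onto this directly: the second entry vouches for the first, and no central certification authority is consulted.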
Communications will focus on overcoming saturation
Anyone who has ever tried to get their emails at a crowded conference has experienced the frustration of data saturation. 
   Our current transmission technologies have focused almost exclusively on increasing speed. Next generation networks will begin to focus elsewhere. New transmission protocols will focus on the ability to cram more data into a given space. 
Imagine a room where every device, from the couch to the oven, is transmitting data in real time, and imagine that room in a large apartment building. Our new protocols must support this scenario given that more and more of our things will become “smart things.”
   So what does this mean for the insurance industry? This means the ability to capture greater volumes of data regarding customers, agents or others. This means developing greater and greater capabilities to build highly functioning IoT networks which remediate events in real time. And this means developing more and more capabilities to solve previously unsolvable “NP” problems with data.
   The Internet of Things is something anyone in technology has an obligation to be conversant in. Insurance organisations which learn to exploit the continuing trends will be the ones who survive. While my predictions may not all come true, I believe that digital insurance leaders must at least address the possibilities current trends portend. The big winners will be those who can see around corners and stay at least one step ahead of the technology curve.
Mr Daniel Angelucci is Chief Technology Officer at CSC Asia, Middle East & Africa.