Business Rules Re-Imagined

The Impact of Cloud, AI/ML, and RPA on Decision Management

Decision making is a key component of today’s business applications. Complex business applications must be able to make decisions following the same rules a human would follow when making those same decisions.

Because business applications are so critical – and because they need to incorporate business know-how to support accurate decision making – the building of these applications is no longer left solely to IT developers. Software development is seeing increasing involvement from the business side.

For example, in the past an insurance company would write an application that records insurance claims. Today, IT is writing applications to sell insurance. That is a huge change. In computer science class, developers are not taught how to sell insurance. And it is not like the insurance company lacks that expertise anyway. They have many people who know how to sell insurance, but none of them work for IT. So more businesses are realizing that they need to involve the business stakeholders in the software development process, and incorporate more business knowledge directly into these applications. 

Clearly, business stakeholders are not developers. They are not going to write code, but they can produce models of their business based on the business rules, the processes, the policies, and the decisions they make while conducting company business. These models can be thought of as the source code that will be deployed within the business apps.

How does IT empower the business user to encode as much of their knowledge as possible so they can add value to the applications the organization depends on? Development teams should utilize business rules rather than simply encoding business logic into applications. When business logic is built into the application – which is the traditional way of developing business applications – the business stakeholders have no visibility. It is hard for them to see what specific policies are being applied; what the rules are; and why a decision is made. 

The solution is to outsource the decision making to a business rules engine, which makes decisions on behalf of the application based on a set of rules written in plain English that the business side can understand. The business side gains much more control over the policies these applications apply automatically.

A business rules engine provides multiple advantages, including:

  • Separation of the business rules from the applications
  • Visibility for business stakeholders
  • Business rules expressed in terms that the business can really understand
  • Enabling business and IT experts to collaborate more effectively
  • The ability to change rules easily and quickly
  • Consistency of rules and decision-making
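
The separation described above can be sketched in a few lines. The example below is a hypothetical, minimal illustration in Python (Red Hat Decision Manager expresses rules in its own languages, such as DRL and DMN, not like this): the rules live as plainly readable data, separate from the application code that asks for decisions.

```python
# Minimal illustrative rules engine: rules are data, kept separate from
# the application code that consumes the decisions. (Hypothetical sketch;
# not the Red Hat Decision Manager API.)

RULES = [
    # (description, condition, decision) -- readable by business stakeholders
    ("Auto-approve small claims",
     lambda claim: claim["amount"] <= 1000 and not claim["prior_fraud"],
     "approve"),
    ("Deny claims flagged for fraud",
     lambda claim: claim["prior_fraud"],
     "deny"),
    ("Escalate everything else",
     lambda claim: True,
     "review"),
]

def decide(claim):
    """Return the decision of the first rule whose condition matches."""
    for description, condition, decision in RULES:
        if condition(claim):
            return decision, description

decision, reason = decide({"amount": 500, "prior_fraud": False})
print(decision, "-", reason)  # approve - Auto-approve small claims
```

Because the rules are data rather than compiled logic, the business side can review them, and changing a policy means editing a rule, not rewriting the application.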

A range of Red Hat customers across various industries are having great success using a rules engine to provide decision services to the organization and to keep business rules separate from application code.

A lot has changed in IT over the past few years that has impacted automated business decision making. In this blog, I will cover three of the major changes that have caused the greatest impact: Cloud, Artificial Intelligence (AI)/Machine Learning (ML), and Robotic Process Automation (RPA).

The Cloud

The cloud has dramatically altered the way we create, deploy, monitor and manage business applications, and that has a tremendous impact on people using business rules engines.

Since the start of IT, organizations have built "monolithic" applications, which are large, difficult to understand, and challenging and time-consuming to change. In the last few years, thanks to the cloud, monolithic apps have been replaced by containers and microservices architecture, making it easier to create, deploy, manage, and change parts of applications. A large app can be broken into smaller components that can be developed, scaled, managed, and changed independently from the other components.

The flexibility of containers and microservices simplifies decision-making because rules can be added as components rather than being embedded in the application. In the cloud, rules can be modified easily in response to changes in the market or in the way the company does business, without rewriting the entire application. This provides businesses with a level of agility they never had before.

Artificial Intelligence (AI)/Machine Learning (ML)

Another set of emerging technologies that is having a major impact on business applications is Artificial Intelligence (AI) and Machine Learning (ML). 

One way of defining ML is “rules that write themselves.” This is obviously a huge leap from where we were 10 years ago. Instead of creating the rules based on your experience and building an app based on those rules, you can now look at historical data, figure out what rules produced those results, and then apply that knowledge to make decisions going forward.

Essentially ML is used for constructing predictive models. For example, with regard to the rules covering when insurance claims are denied or paid, you have data about how you arrived at decisions in particular cases. You could use ML to build predictive models to repeat those decisions in similar cases. 
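
A toy illustration of "rules that write themselves" (all data and logic here are made up for illustration; real ML pipelines are far more involved): instead of hand-writing an approval threshold, derive it from historical claim decisions.

```python
# Toy "rules that write themselves": derive a claim-approval threshold
# from historical decisions instead of writing it by hand.
# (Hypothetical data and logic, for illustration only.)

history = [
    (300, "paid"), (800, "paid"), (950, "paid"),
    (1200, "denied"), (5000, "denied"), (2500, "denied"),
]

def learn_threshold(history):
    """Learn the largest amount that was historically paid."""
    return max(amount for amount, outcome in history if outcome == "paid")

threshold = learn_threshold(history)

def predict(amount):
    """Apply the learned rule to a new claim."""
    return "paid" if amount <= threshold else "denied"

print(threshold)     # 950
print(predict(700))  # paid
print(predict(3000)) # denied
```

Note that the "learned" rule can only reproduce what is in the data – which is exactly the historical-bias limitation discussed below.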

How do predictive models compare with user-written rules in terms of their usefulness and viability for these types of decision applications?

  • Rules are created by people, while predictive models are automatically created based on analysis of historical data
  • Rules produce results that are explainable, while predictive models produce results that are not explainable.
  • Rules are subject to human error. One of the challenges with rule-based systems is you get gaps – such as situations you forgot about, or edge cases you did not address. Conversely, predictive models are subject to historical bias. You are limited to reproducing behavior apparent from the training data. So if the data is biased, you get a biased model. 
  • Rules take a significant amount of time to produce, while it is relatively quick to generate a predictive model once you have the data.

The goal when using ML to augment decision making is to combine the advantages of applied rules and predictive models. New rule languages, like Decision Model and Notation (DMN), make this possible by simplifying the process of creating rules.

In the past, rules were created in a complex notation. Thousands of rules leave open many possibilities for error. DMN is a graphical language for encoding the rules that make a decision. It makes it much easier for business stakeholders to create the source code for their decision applications. They create a graph to encode complex business logic. Then they can incorporate a predictive model into the DMN diagram, and they can incorporate business rules as well, effectively combining the best of both options.
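
The combination can be sketched as follows. This is a hypothetical Python illustration, not DMN itself (DMN expresses the same idea graphically): explicit rules take precedence where the business must stay in control, and the predictive model handles everything else.

```python
# Combining explicit business rules with a predictive model: rules handle
# the cases the business wants to control directly; the model covers the
# rest. (Illustrative sketch; the model and thresholds are made up.)

def risk_model(claim):
    """Stand-in for a trained predictive model (hypothetical scoring)."""
    return 0.9 if claim["amount"] > 2000 else 0.2

def decide(claim):
    # Explicit rule first: fraud-flagged claims are always denied,
    # regardless of what the model says.
    if claim["prior_fraud"]:
        return "deny"
    # Fall through to the predictive model for everything else.
    return "review" if risk_model(claim) > 0.5 else "approve"

print(decide({"amount": 100, "prior_fraud": True}))   # deny
print(decide({"amount": 100, "prior_fraud": False}))  # approve
print(decide({"amount": 5000, "prior_fraud": False})) # review
```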

Robotic Process Automation

Robotic Process Automation (RPA) is an exciting, fast moving space right now. With RPA, software robots are developed to perform routine, repetitive work that would otherwise be done by a human worker. The advantages of RPA are all about reducing cost and headcount by automating tasks.

Time spent copying information from a back office database into a spreadsheet, and moving data around from system to system – this is not the type of work where a human worker adds value to the process, but it still has to be done. RPA allows an organization to automate those repetitive tasks. 

One of the best advantages of RPA is that it enables you to automate work without having to change your underlying systems. You can keep your legacy mainframe and other applications exactly as they are. The robot pretends to be a human and carries out the same tasks exactly the same way a human worker would. 

But it is very important to consider RPA as just another software development approach. In the future, as RPA evolves, I would expect to see containerized robots roaming around your hybrid cloud. But for now, RPA is just another app, and it needs to be managed just like any other app. Version control and QA are very important.

When you create a bot, beware of the attack surface. You create an opportunity for someone to add a few lines of script to that bot. Think about the damage that could be inflicted by an automated bot with high level access to all your enterprise data. That is a key reason why RPA should be subject to the same governance as any other software.

It is also very important to understand that the value of RPA is in the ability to automate human work, not to patch holes in IT systems. If you find yourself building bots to fix holes in IT, you really need to take a good look at the infrastructure instead. It is not productive to use bots as band-aids, because the bots themselves will continuously break as things change within the infrastructure. It is more productive to fix the source of the issue in the underlying infrastructure.

For RPA to remain relevant and continue to support software development, bots should be compatible with the cloud and able to run in containerized environments. This is technology we expect to see in the next few years.

The Red Hat Solution

Red Hat is very active in the software development space and offers a range of tools designed to solve the challenges associated with incorporating rules and decision making into business applications:

  • Red Hat Decision Manager is a platform for developing containerized microservices and applications that automate business decisions. Decision Manager includes business rules management, complex event processing, and support for building DMN models.
  • Red Hat Process Automation Manager is a platform for developing containerized microservices and applications that automate both business decisions and processes. Process Automation Manager includes business process management (BPM), business rules management (BRM), and business resource optimization and complex event processing (CEP) technologies. It also includes a user experience platform to create engaging user interfaces for process and decision services with minimal coding. 
  • For development in the cloud, Red Hat OpenShift is an enterprise-ready Kubernetes container platform with full-stack automated operations to manage hybrid cloud and multicloud deployments.
  • Red Hat Runtimes is a set of products, tools, and components for developing and maintaining cloud-native applications. It offers lightweight runtimes and frameworks for highly-distributed cloud architectures, such as microservices. 

Red Hat at VoxxedDays Microservices


Microservices are no longer a playground: developers and architects have adopted them and re-architected applications to reap their benefits, and many have deployed them to production. Voxxed Days Microservices is an exciting place to share those experiences and to hear what Red Hat has been doing in this space with MicroProfile, Thorntail, SmallRye, and more. As you might know, Voxxed Days Microservices 2018 will take place in Paris on 29 October, and Red Hat will be there together with the Eclipse Foundation. We have an exciting lineup of sessions and a presence at the booth for conversations, questions, and discussions on the topic.


Here is a list of sessions:


All session times are CET:

  • Monday 09:15 – Keynote: Distant past of Microservices
  • Monday 14:30 – Data in a Microservices world: from conundrum to options
  • Monday 18:15 – Ask the Architects
  • Tuesday 11:15 – Thorntail – A Micro Implementation of Eclipse MicroProfile
  • Tuesday 13:00 – What is SmallRye and how can you help?
  • Tuesday 14:30 – Data Streaming for Microservices using Debezium


There will also be a full day hands-on workshop on Wednesday, October 31st for those interested in learning more about Microprofile.


Come by the Eclipse MicroProfile booth and pick up some swag!


For more information:

Voxxed Days Microservices conference program at-a-glance

The path to cloud-native applications: 8 steps

Red Hat OpenShift Application Runtimes

Understanding cloud-native apps

Understanding middleware

Thorntail.io

MicroProfile.io


MicroProfile/Thorntail presence by Red Hat at CodeOne 2018

This is the first edition of Oracle’s CodeOne conference (October 22-25, née JavaOne), which expands beyond Java to a variety of runtimes, among other things. Red Hat will be present at the conference delivering workshops, keynotes, and sessions on a variety of topics. As a leader in open source, Java, cloud, containers, microservices, and cloud-native Java, Red Hat will host a series of talks on our implementation of MicroProfile using the open source project Thorntail. Here is a list of our MicroProfile-related sessions:

All session times are US PST:

  • 10/22/2018 12:30 @ Moscone West, Room 2018 – Cloud Native Java EE with MicroProfile
  • 10/22/2018 13:30 @ Moscone West, Room 2008 – Building a Fault-Tolerant Microservice in an Hour
  • 10/23/2018 20:30 @ Moscone West, Room 2018 – Eclipse MicroProfile: What’s Next?
  • 10/24/2018 16:00 @ Moscone West, Room 2018 – CDI from Monolithic Applications to Java 11 jlink Images
  • 10/25/2018 11:00 @ Moscone West, Room 2011 – Thorntail: A Micro Implementation of Eclipse MicroProfile


There will also be other MicroProfile-related sessions delivered by members of the community.

Come and visit us at the Red Hat booth (#5401) and pick up your swag!

For more information:

Red Hat sessions at Oracle CodeOne 2018

The path to cloud native applications: 8 steps

Red Hat OpenShift Application Runtimes

Understanding cloud-native apps

Understanding middleware

Thorntail.io

MicroProfile.io


MicroProfile/Thorntail presence by Red Hat at EclipseCon Europe 2018

EclipseCon Europe is fast approaching, and many open source users, vendors, corporations, and developers will converge on Ludwigsburg, Germany on October 23-25, 2018 for the once-a-year event that brings the latest trends in IoT, web and cloud development, Java and Java development toolkits, tools and IDEs, and cloud-native Java. As a leader in open source, Java, cloud, containers, microservices, and cloud-native Java, Red Hat will be present with a series of talks on our implementation of MicroProfile using the open source project Thorntail. Here is a list of our MicroProfile-related sessions:

All session times are CET:

  • 23 Oct 2018, 15:15 – Path to Cloud-native Application Development: 8 steps
  • 24 Oct 2018, 14:00 – Thorntail – A Micro Implementation of Eclipse MicroProfile
  • 24 Oct 2018, 16:30 – Distributed Tracing for MicroProfile Runtimes
  • 25 Oct 2018, 10:45 – Cloud Native development with Eclipse MicroProfile on Kubernetes


There will also be other MicroProfile-related sessions delivered by members of the community.

Red Hat will also participate in the Community Day on Monday, October 22, 2018 with workshops, “meet the spec” talks and process and implementation discussions related to MicroProfile and Jakarta EE.

Come and visit us at the Red Hat booth (booth #1) and pick up your swag!

For more information:

Red Hat sessions at EclipseCon Europe 2018

The path to cloud native applications: 8 steps

Red Hat OpenShift Application Runtimes

Understanding cloud-native apps

Understanding middleware

Thorntail.io

MicroProfile.io


From BPM and business automation to digital automation platforms

The business processes that create customer value are the critical piece that links together all of the different aspects of digital transformation. Still, many of the critical activities that contribute to that value are either manual or a succession of disconnected workflows or applications, preventing organizations from having an end-to-end view of how their processes deliver customer value.

Evolving from workflows to BPM – business process management – added a whole collaborative layer and execution structure to the traditional hierarchy and project-based structure of the enterprise. When it was paired with access to critical data and documents, alongside activity visibility and business rules, it dramatically grew productivity and agility in the enterprise for many years.

Nowadays, enterprises have already discovered how to apply these technologies: working with structured and unstructured processes, creating business rules to guide and support decision making, and integrating process inputs and outputs in real time with the external systems that interact with the processes. These process-centric applications are even cloud-ready, so you can run your processes in the cloud and open them up more securely to all of your internal and external collaborators.

But times are changing. Productivity and agility are no longer the name of the game. It is no longer enough to provide ease of use, business and IT collaboration, or fast modification of processes and rules. Speed and support for digital transformation have become top priorities. Those process-based applications need to be quickly deployed into production; be portable, reusable, and consistent across environments; and scale in the hybrid cloud. Our customers expect cloud-native technologies at the core of their processes. They expect to run their process workloads at scale across the hybrid cloud to provide a consistent experience to their customers and collaborators. Ideally, they also want to future-proof their investments with modern technologies such as containers.

Continue reading “From BPM and business automation to digital automation platforms”

Meet application integration in the times of hybrid cloud

The concept of agile integration, depending on whom you ask, may appear to be a contradiction in terms. Integration used to be associated with “slow,” “monolithic,” “only to be touched by the expert team,” and so on. Big, complex legacy enterprise service buses connected to your applications were the technology of choice at a time when agility was not a requirement, when the cloud was barely an idea, and when containers were associated with maritime shipping rather than application packaging and delivery.

Can the principles of agile development be combined with those of modern integration? Our response is yes, and we call it agile integration. Let me show you what it is, why it is important, and what we at Red Hat are doing about it.

Software development methodologies have evolved rapidly in the last few years to incorporate innovative concepts that result in faster development cycles, agility to react to changes, and immediate business value. Development now takes place in small teams, changes can be approved and incorporated quickly to keep pace with the changing demands of the business, and each iteration of the code results in a working product. There is no more need for long development cycles and never-ending approvals for changes. And importantly, business and technical users join forces and collaborate to optimize the end result.

In addition, modern integration requires agility, cloud-readiness, and support for modern integration approaches. In contrast with legacy, monolithic ESBs, modern integration is lightweight, pattern-based, scalable, and able to manage complex, distributed environments. It has to be cloud-ready and support modern architectures and deployment models like containers. It also has to provide integration services with new, popular technologies, like API management, which is becoming the preferred way to integrate applications and is at the core of microservices architectures. And it must support innovative, fast-evolving use cases such as the Internet of Things (IoT).

Continue reading “Meet application integration in the times of hybrid cloud”

Red Hat present at EclipseCon France 2018

EclipseCon France is taking place this week in Toulouse, France (June 13-14, 2018) and it’s offering a great lineup of top-notch sessions on nine different tracks, from IoT to cloud and modeling technologies. This year, there is even a dedicated track for “Microservices, MicroProfile, EE4J and Jakarta EE,” which is covering topics such as Istio, 12-factor apps, geoscience, machine learning, noSQL database integration, cloud-native application development, security, resilience, scalability, and the latest statuses of the Jakarta EE and MicroProfile open source specification projects. Under this track, we are hosting two sessions:

But we are also delivering other interesting sessions under the “Reactive Programming” track:

Under the “IoT” track:

Under the “Eclipse IDE and RCP in Practice” track:

And, under the “Cloud & DevOps” and “Other Cool Stuff” tracks:

For those of you that will be at the conference, we invite you to attend the sessions above and to stop by the Red Hat booth to learn how Red Hat can help your organization solve your IT challenges (and get your swag too!). And for those of you that would like to learn more about Red Hat offerings in relation to the topics above, please visit the following links:

Red Hat OpenShift Application Runtimes: Delivering new productivity, performance, and stronger standards support with its latest sprint release

Red Hat OpenShift Application Runtimes is a collection of cloud-native application runtimes that are optimized to run on OpenShift, including Eclipse Vert.x, Node.js, Spring Boot, and WildFly Swarm. In addition, OpenShift Application Runtimes includes the Launch Service, which helps developers get up and running quickly in the cloud through a number of ready-to-run examples — or missions — that streamline developer productivity.

New Cache Booster with JBoss Data Grid integration

In our latest continuous delivery release, we have added a new cache mission that demonstrates how to use a cache to improve the response time of applications. This mission shows you how to:

  1. Deploy a cache to OpenShift.
  2. Use a cache within an application.

The common use case for this booster is to cache service result sets to decrease the latency associated with data access and to reduce the workload on backend services. Another very common use case is to reduce the volume of messages sent across a distributed system.
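
The caching pattern the booster demonstrates can be sketched as follows (an illustrative stand-in only; the actual booster uses JBoss Data Grid on OpenShift, not this toy class): repeated requests for the same result set are served from the cache instead of hitting the slow backend again.

```python
# Sketch of the caching pattern: keep recent service results in a cache
# so repeated requests skip the slow backend service.
# (Hypothetical illustration; not the JBoss Data Grid API.)

import time

class TTLCache:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (value, expiry timestamp)

    def get_or_compute(self, key, compute):
        entry = self.store.get(key)
        if entry and entry[1] > time.monotonic():
            return entry[0]  # cache hit: no backend call
        value = compute()    # cache miss: call the backend
        self.store[key] = (value, time.monotonic() + self.ttl)
        return value

calls = 0
def slow_backend():
    """Stand-in for an expensive backend query."""
    global calls
    calls += 1
    return "result"

cache = TTLCache(ttl_seconds=60)
cache.get_or_compute("claims", slow_backend)
cache.get_or_compute("claims", slow_backend)  # served from cache
print(calls)  # 1 -- the backend was hit only once
```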

Continue reading “Red Hat OpenShift Application Runtimes: Delivering new productivity, performance, and stronger standards support with its latest sprint release”

Red Hat makes Node.js a first-class citizen on OpenShift with RHOAR, by Conor O’Neill, nearForm

Red Hat’s offering in cloud-native application development has just taken another step forward with the announcement of supported Node.js. Conor O’Neill from our partner nearForm shares his thoughts on the role that Node.js and Red Hat OpenShift Application Runtimes (RHOAR) will take in Red Hat’s market leadership in Cloud-Native application development, modernization and migration.

Read more here: Red Hat makes Node.js a first-class citizen on OpenShift with RHOAR, by Conor O’Neill, nearForm

Luis I. Cortes. Senior Manager, Middleware Partner Strategy – @licortes_redhat

Cloud Native Application Development – Adopt or Fail

In today’s digital world, software strategy is central to business strategy. To stay competitive, organizations need customized software applications to meet their unique needs — from customer engagements to new product and services development. Drawn-out development projects are no longer acceptable, given business demands. Therefore, the need to speed up application development, testing, delivery, and deployment is no longer optional but a must-have competency.   

At the same time that developers are confronting this challenge to deliver solutions more quickly, they are also facing the most diverse technology ecosystem in the history of computing.  To address this challenge, development teams must modernize architecture, infrastructure, and processes to deliver higher-quality applications with greater agility.

Cloud native development is an approach to building and running applications that fully exploits the advantages of the cloud computing model. Cloud-native development is multidimensional, involving architecture, infrastructure, and processes based upon four key tenets:

  1. Services-based architecture: microservices or any modular, loosely coupled model, enabling independent scalability, flexible maintenance, and polyglot language runtimes.
  2. Containers: the deployment unit and self-contained execution environment (for example, a Docker image), providing consistency and portability across cloud infrastructures.
  3. DevOps automation: automating the processes, practices, and instrumentation that carry applications from development through test to deployment.
  4. API-based design: the only communication allowed is via service interface calls over the network – no direct linking, no direct reads of another team’s data store, no shared-memory model – giving each service an outside-in perspective.
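
The API-based design tenet can be sketched as follows (a hypothetical in-process stand-in for what would be network calls between services): the consuming team depends only on the service interface, never on the other team’s data store.

```python
# Sketch of API-based design: the consumer talks only to the service's
# interface, never to its data store. (Hypothetical in-process stand-in
# for what would be HTTP/REST calls between containers.)

class ClaimsService:
    def __init__(self):
        self._store = {"42": {"status": "open"}}  # private to this service

    # The public interface -- the ONLY supported way in.
    def get_status(self, claim_id):
        claim = self._store.get(claim_id)
        return claim["status"] if claim else "unknown"

class Dashboard:
    """A separate team's component: depends on the interface, not the data."""
    def __init__(self, claims_api):
        self.claims_api = claims_api

    def summary(self, claim_id):
        return f"claim {claim_id}: {self.claims_api.get_status(claim_id)}"

dashboard = Dashboard(ClaimsService())
print(dashboard.summary("42"))  # claim 42: open
```

Because `Dashboard` never touches `_store` directly, the claims team can change its storage freely without breaking consumers – the property that makes independent scaling and deployment possible.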

Continue reading “Cloud Native Application Development – Adopt or Fail”