DecisionCamp 2019, Decision Manager, AI, and the Future

A few days ago my fellow Red Hatters Mario Fusco, Matteo Mortari, Mark Proctor, Donato Marrazzo and I (Edson Tirelli) had the opportunity to attend DecisionCamp 2019. Following the tradition of previous years, it is a conference focused on Decision Management and related topics, with an emphasis on practitioners, vendors and users of the technology. In other words, a three-day conference that packs a lot of content, mostly technical and strategic.

This year in particular the agenda was packed with interesting and relevant topics, ranging from human-centric subjects (like coordination of collaborative decisions), to compelling use cases (like airport gate scheduling), to glimpses of what is coming in the DMN standard (like temporal reasoning and discussions about DMN 2.0).

Red Hat delivered two presentations: 

  • Decision Management + Machine Learning: a standards-based approach: Matteo Mortari and I presented how the purposeful use of the “triple crown” of standards (CMMN, BPMN and DMN), in combination with PMML (Predictive Model Markup Language), enables businesses to leverage machine learning to automate complex decisions in a vendor-neutral and effective solution, while promoting transparency and simplifying explainability.
  • How and Why I Turned a Rule Engine into a First-Class Serverless Component: Mario Fusco and Matteo Mortari presented how the latest advancements in JVM, cloud and containerization technologies made it possible to re-architect the Drools project into a cloud-native, first-class serverless component.

You can find a good review of all the presentations on Sandy Kemsley’s blog. All the slide decks are also available for download from the Decision Camp website.

Trends in Enterprise AI and Digital Decisions

One of the highlights of the conference was the keynote presentation by Mike Gualtieri of Forrester on “Trends in Enterprise AI and Digital Decisions”.

During the presentation he touched on several subjects, past, present and future, but the message was clear: effective Decision Automation (or Digital Decisioning, Business AI, or whatever name you choose to use) is not a one-trick solution. It requires a number of technologies working in combination to truly address today’s enterprise challenges.

AI is a hot term in the market right now, but Machine Learning (ML) without the framework of Digital Decisioning (DD, also called Decision Management) and the optimizations from a Constraint Solver (which he called a Mathematical Optimization (MO) engine) is too unpredictable and opaque to be effective. In his words: AI = ML + MO + DD.

His argument is that AI is not only a requirement but an inevitability for any company looking to become a leader in its industry. He does caveat that statement by explaining that he is referring to Pragmatic AI (or Applied AI), which is focused on solving specific problems, and not the Pure AI we sometimes see in science fiction movies.

He then explained that although Pragmatic AI (in the context of Machine Learning) is a game changer, it is essentially a model with probabilistic predictions as outputs, and the quality of the result depends on both the model and the quality of the data used for training. Being probabilistic, on rare occasions it can produce outlier results that make no sense for the business or, more often, answers that are sub-optimal.

That is where combining Mathematical Optimization (MO) and Digital Decisioning (DD) technologies can hugely improve the success of these solutions. Mathematical Optimization can be used to constrain or improve the results generated by ML predictive models, eliminating those outliers and improving sub-optimal decisions. Digital Decisioning can leverage those solutions for explainable automated decisions, while keeping humans embedded in the process, as DD is effectively a translation of human knowledge into executable computer models.
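
To make that division of labor concrete, here is a toy sketch in plain Java (not OptaPlanner, and with invented names and thresholds): the ML model supplies fraud scores, while a hard business constraint decides what is actually acted on.

```java
import java.util.Comparator;
import java.util.List;

public class AuditSelector {

    record ScoredClaim(String id, double fraudProbability) {}

    /**
     * Keeps the model's top suspicions, but never exceeds the audit team's
     * capacity: the model proposes, the constraint disposes.
     */
    static List<ScoredClaim> selectForAudit(List<ScoredClaim> scored, int capacity) {
        return scored.stream()
                .filter(c -> c.fraudProbability() > 0.5)  // ML model output (hypothetical threshold)
                .sorted(Comparator.comparingDouble(ScoredClaim::fraudProbability).reversed())
                .limit(capacity)                          // hard constraint from the business
                .toList();
    }
}
```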

Lots of insights and food for thought, and we were happy to see that his advice and observations match, to some degree, what we’ve been doing with the Red Hat Business Automation platform.

Red Hat Business Automation ships with a Mathematical Optimization engine (OptaPlanner) and a Decision Engine (Drools) out of the box. OptaPlanner leverages business rules for the specification of constraints and scoring rules, and can also be used for decision optimization and resource planning. The big news we demoed live at DecisionCamp is that, starting with version 7.5, users can transparently integrate Machine Learning models through PMML (Predictive Model Markup Language), without any glue code, directly into their DMN (Decision Model and Notation) models. That level of seamless integration is critical to reducing time to market and improving the transparency and efficiency of automated decisions.
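
As a minimal sketch of what that looks like from Java (the model namespace, name and inputs below are hypothetical), the application evaluates the DMN model through the Kie DMN API; the PMML import is declared inside the DMN file itself, which is why no ML-specific glue code appears here:

```java
import java.util.Map;

import org.kie.api.KieServices;
import org.kie.api.runtime.KieContainer;
import org.kie.dmn.api.core.DMNContext;
import org.kie.dmn.api.core.DMNModel;
import org.kie.dmn.api.core.DMNResult;
import org.kie.dmn.api.core.DMNRuntime;

public class DmnPmmlExample {
    public static void main(String[] args) {
        // The kjar on the classpath packages the DMN model (and its PMML import).
        KieServices ks = KieServices.Factory.get();
        KieContainer container = ks.getKieClasspathContainer();
        DMNRuntime dmnRuntime = container.newKieSession().getKieRuntime(DMNRuntime.class);

        DMNModel model = dmnRuntime.getModel("https://example.org/dmn", "Loan Approval");
        DMNContext ctx = dmnRuntime.newContext();
        ctx.set("Applicant", Map.of("Age", 47, "Monthly Income", 5000));

        // Decisions backed by the PMML model evaluate like any other decision.
        DMNResult result = dmnRuntime.evaluateAll(model, ctx);
        result.getDecisionResults().forEach(dr ->
                System.out.println(dr.getDecisionName() + " = " + dr.getResult()));
    }
}
```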

All in all, a great conference! Thank you to the organizers for putting together such a strong lineup of content.

Business Rules Re-Imagined

The Impact of Cloud, AI/ML, and RPA on Decision Management

Decision making is a key component of today’s business applications. Complex business applications must be able to make decisions following the same rules a human would follow when making those same decisions.

Because business applications are so critical – and because they need to incorporate business know-how to support accurate decision making – the building of these applications is no longer left solely to IT developers. Software development is seeing increasing involvement from the business side.

For example, in the past an insurance company would write an application that records insurance claims. Today, IT is writing applications to sell insurance. That is a huge change. In computer science class, developers are not taught how to sell insurance. And it is not like the insurance company lacks that expertise anyway. They have many people who know how to sell insurance, but none of them work for IT. So more businesses are realizing that they need to involve the business stakeholders in the software development process, and incorporate more business knowledge directly into these applications. 

Clearly, business stakeholders are not developers. They are not going to write code, but they can produce models of their business, based on the business rules, the processes, the policies, and the decisions they make while conducting company business. These models can be thought of as the source code that will be deployed within the business apps.

How does IT empower business users to encode as much of their knowledge as possible so they can add value to the applications the organization depends on? Development teams should utilize business rules rather than simply encoding business logic into applications. When business logic is built into the application – which is the traditional way of developing business applications – the business stakeholders have no visibility. It is hard for them to see what specific policies are being applied, what the rules are, and why a decision is made.

The solution is to outsource the decision making to a business rules engine, which makes decisions on behalf of the application based on a set of rules written in plain English that the business side can understand. The business gains much more control over the policies that are applied automatically by these applications.
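
As a minimal sketch of that outsourcing (the Claim fact class is a stand-in, and the rules are assumed to be packaged on the classpath as a kjar), the application hands its facts to the Drools engine and reads back the decision:

```java
import org.kie.api.KieServices;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.KieSession;

public class ClaimDecisionService {

    /** Minimal stand-in fact type; the rules match on it and set the status. */
    public static class Claim {
        public double amount;
        public String status = "UNDECIDED";
    }

    public Claim decide(Claim claim) {
        KieServices ks = KieServices.Factory.get();
        KieContainer container = ks.getKieClasspathContainer();
        KieSession session = container.newKieSession();
        try {
            session.insert(claim);   // hand the facts to the engine
            session.fireAllRules();  // the rules, not the app, decide the outcome
            return claim;
        } finally {
            session.dispose();       // always release the session
        }
    }
}
```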

A business rules engine provides multiple advantages, including:

  • Separation of the business rules from the applications
  • Visibility for business stakeholders
  • Business rules expressed in terms that the business can really understand
  • Enabling business and IT experts to collaborate more effectively
  • The ability to change rules easily and quickly
  • Consistency of rules and decision-making

A range of Red Hat customers across various industries are having great success using a rules engine to provide decision services to the organization while keeping business rules separate from application code.

A lot has changed in IT over the past few years that has impacted automated business decision making. In this blog, I will cover three of the major changes that have caused the greatest impact: Cloud, Artificial Intelligence (AI)/Machine Learning (ML), and Robotic Process Automation (RPA).

The Cloud

The cloud has dramatically altered the way we create, deploy, monitor and manage business applications, and that has a tremendous impact on people using business rules engines.

Since the start of IT, organizations have built “monolithic” applications, which are large, difficult to understand, and challenging and time-consuming to change. In the last few years, thanks to the cloud, monolithic apps have been giving way to containers and microservices architectures, making it easier to create, deploy, manage and change parts of applications. A large app can be broken into smaller components that can be developed, scaled, managed and changed independently of the other components.

The flexibility of containers and microservices simplifies decision-making because rules can be added as components rather than being embedded in the application. In the cloud, rules can be modified easily in response to changes in the market or in the way the company does business, and the entire application does not have to be rewritten. This gives businesses a level of agility they never had before.
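
A hedged sketch of what that looks like from the application’s side; the decision runs in its own container, and the service URL and payload here are hypothetical stand-ins:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class DecisionClient {

    private final HttpClient http = HttpClient.newHttpClient();

    /** Sends claim data to a containerized decision service and returns its verdict. */
    public String evaluateClaim(String claimJson) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://decisions.example.internal:8080/claims/approval"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(claimJson))
                .build();
        // The rules behind this endpoint can be redeployed on their own schedule;
        // this application never changes when the rules do.
        HttpResponse<String> response =
                http.send(request, HttpResponse.BodyHandlers.ofString());
        return response.body();
    }
}
```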

Artificial Intelligence (AI)/Machine Learning (ML)

Another set of emerging technologies that is having a major impact on business applications is Artificial Intelligence (AI) and Machine Learning (ML). 

One way of defining ML is “rules that write themselves.” This is obviously a huge leap from where we were 10 years ago. Instead of creating the rules based on your experience and building an app based on those rules, you can now look at historical data, figure out what rules produced those results, and then apply that knowledge to make decisions going forward.

Essentially ML is used for constructing predictive models. For example, with regard to the rules covering when insurance claims are denied or paid, you have data about how you arrived at decisions in particular cases. You could use ML to build predictive models to repeat those decisions in similar cases. 

How do predictive models compare with user-written rules in terms of their usefulness and viability for these types of decision applications?

  • Rules are created by people, while predictive models are automatically created based on analysis of historical data.
  • Rules produce results that are explainable, while predictive models produce results that are not explainable.
  • Rules are subject to human error. One of the challenges with rule-based systems is you get gaps – such as situations you forgot about, or edge cases you did not address. Conversely, predictive models are subject to historical bias. You are limited to reproducing behavior apparent from the training data. So if the data is biased, you get a biased model. 
  • Rules take a significant amount of time to produce, while it is relatively quick to generate a predictive model once you have the data.

The goal when using ML to augment decision making is to combine the advantages of applied rules and predictive models. New rule languages, like Decision Model and Notation (DMN), make this possible by simplifying the process of creating rules.

In the past, rules were created in a complex notation, and thousands of rules leave open many possibilities for error. DMN is a graphical language for encoding the rules that make a decision, and it makes it much easier for business stakeholders to create the source code for their decision applications. They create a graph to encode complex business logic, then incorporate a predictive model into the DMN diagram, and they can incorporate business rules as well, effectively combining the best of both options.
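
Stripped of tooling, the combination looks something like this sketch: the predictive score informs the decision while hand-written rules guard the hard constraints. Every name and threshold here is invented for illustration.

```java
public class ClaimDecision {

    enum Outcome { APPROVE, DENY, REFER_TO_HUMAN }

    /**
     * Hard business rules run first: explainable and non-negotiable.
     * The predictive model then handles the broad middle ground it was
     * trained on, with a human catching the cases neither side trusts.
     */
    Outcome decide(boolean policyActive, double claimAmount, double fraudScore) {
        if (!policyActive) return Outcome.DENY;                    // rule
        if (claimAmount > 100_000) return Outcome.REFER_TO_HUMAN;  // rule
        if (fraudScore > 0.8) return Outcome.REFER_TO_HUMAN;       // model outlier guard
        return fraudScore < 0.2 ? Outcome.APPROVE                  // model says low risk
                                : Outcome.REFER_TO_HUMAN;
    }
}
```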

Robotic Process Automation

Robotic Process Automation (RPA) is an exciting, fast moving space right now. With RPA, software robots are developed to perform routine, repetitive work that would otherwise be done by a human worker. The advantages of RPA are all about reducing cost and headcount by automating tasks.

Time spent copying information from a back office database into a spreadsheet, and moving data around from system to system – this is not the type of work where a human worker adds value to the process, but it still has to be done. RPA allows an organization to automate those repetitive tasks. 

One of the best advantages of RPA is that it enables you to automate work without having to change your underlying systems. You can keep your legacy mainframe and other applications exactly as they are. The robot pretends to be a human and carries out the same tasks exactly the same way a human worker would. 

But it is very important to consider RPA as just another software development approach. In the future, as RPA evolves, I would expect to see containerized robots roaming around your hybrid cloud. But for now, RPA is just another app, and it needs to be managed like any other app. Version control and QA are very important.

When you create a bot, beware of the attack surface: you create an opportunity for someone to add a few lines of script to that bot. Think about the damage that could be inflicted by an automated bot with high-level access to all your enterprise data. That is a key reason why RPA should be subject to the same governance as any other software.

It is also very important to understand that the value of RPA is in the ability to automate human work, not to patch holes in IT systems. If you find yourself building bots to fix holes in IT, you really need to take a good look at the infrastructure instead. It is not productive to use bots as band-aids, because the bots themselves will continually break as things change within the infrastructure. It is more productive to fix the source of the issue in the underlying infrastructure.

For RPA to remain relevant and continue to support software development, bots should be compatible with the cloud, and be able to run in containerized environments. This is technology we expect to see in the next few years or so.

The Red Hat Solution

Red Hat is very active in the software development space and offers a range of tools designed to solve the challenges associated with incorporating rules and decision making into business applications:

  • Red Hat Decision Manager is a platform for developing containerized microservices and applications that automate business decisions. Decision Manager includes business rules management, complex event processing, and support for building DMN models.
  • Red Hat Process Automation Manager is a platform for developing containerized microservices and applications that automate both business decisions and processes. Process Automation Manager includes business process management (BPM), business rules management (BRM), and business resource optimization and complex event processing (CEP) technologies. It also includes a user experience platform to create engaging user interfaces for process and decision services with minimal coding. 
  • For development in the cloud, Red Hat OpenShift is an enterprise-ready Kubernetes container platform with full-stack automated operations to manage hybrid cloud and multicloud deployments.
  • Red Hat Runtimes is a set of products, tools, and components for developing and maintaining cloud-native applications. It offers lightweight runtimes and frameworks for highly-distributed cloud architectures, such as microservices. 

Robotic Process Automation and Cloud Technology – Challenges and Opportunities

The original article was published on IT Toolbox on July 23, 2019. 

Robotic Process Automation holds incredible promise for organizations looking to drive greater efficiency and cost savings; however, the industry must overcome several crucial challenges before it can truly live up to its potential. This article unpacks those challenges and explores the opportunities ahead.

By now, you’ve probably heard about Robotic Process Automation (RPA). It is not an especially new idea, but it is suddenly gaining attention as businesses strive to become more digital. The promise of RPA is to provide quick and significant cost savings by automating human tasks with software robots. In fact, PwC estimates that “45% of work activities could be automated, and that this automation would save $2 trillion in global workforce costs.”

Challenges Faced by Organizations

Today, there are thousands of software robots automating everything from simple tasks, like order entry and invoice preparation, to complex interactions, like issue resolution and customer service. But there are challenges awaiting the many organizations that have rushed to deploy robots.

1. Cloud Infrastructure Challenge:

First, there’s the matter of the cloud. Before RPA came along, those same organizations were busy planning multi-year efforts to reap the benefits of cloud computing. Moving IT to the cloud offers a similarly enticing cost benefit, but it is a long-term project requiring the deployment of new and emerging technologies.

Much has been invested in containers, orchestration, microservice and service mesh architectures, etc., as we lay the foundations for a serverless, data center-less future. However, RPA has some catching up to do. It is still confined to the desktop—the Windows desktop, to be precise.

The majority of software robots currently deployed are of the ‘attended’ type. This means that they exist on your Windows desktop, much like the little ‘Clippy’ assistant in bygone versions of Microsoft Office, where they do things like move rows of data from a back-office database to a spreadsheet so that you can focus on more important things.

In recent years, RPA has evolved to enable ‘unattended’ bots to manipulate your enterprise data behind the scenes, on Windows servers. That’s a step in the cloud direction, but still far from the notion of cloud-native bots that can cruise around your hybrid cloud and fix whatever needs fixing.

When will we see containerized bots, orchestrated by standard platforms like Kubernetes and Istio? Well, presumably not until RPA vendors realize the central role that Linux plays in modern cloud architectures. But more importantly, not until RPA goes open source. Why? Because open source software is the central pillar of modern cloud stacks, and if RPA is to have a role in hybrid cloud infrastructure, it must be open source too.

However, today there is very little in the way of open source RPA. There are a few open source RPA-like projects, such as TagUI, Robot Framework and SikuliX, but these are very bare-bones compared to the market-leading proprietary products currently available. The opportunity for those proprietary vendors to play in the hybrid cloud market is immense if they can embrace open source business models.

2. Cost Challenge:

The second challenge for users of RPA looking to save on labor costs is that today’s bots just aren’t all that smart. They don’t measure up to their human counterparts in their ability to figure out how to get the job done when some part of it turns out a little differently. Some bots are simple macros, repeating the same series of steps over and over. Others may have a little more intelligence, perhaps a rules engine to handle complex scenarios, but very few have anything close to actual intelligence.

The world of AI and ML is currently separate from RPA, and although some bots may be able to utilize AI services, like IBM Watson, none of them have the built-in ability to learn from past experience so they can do a better job the next time. Consequently, the anticipated cost savings don’t always materialize, and bots can be limited to highly structured and repetitive tasks. Just like with the cloud, though, there is opportunity here. I expect a marriage of RPA and AI/ML to happen soon, opening up a new landscape of possibility for automated business.

3. Implementation Challenge:

Finally, there’s the implementation challenge: how to deploy RPA technology so that it supports your IT strategy rather than hobbles it. It’s easy to be tempted by RPA’s promise of a quick fix into attacking the symptoms of your problems rather than the root cause. Some organizations deploy bots as ‘band-aids’ to relieve bottlenecks in semi-automated processes, when the real problem is an aging infrastructure that can’t accommodate new business requirements. This may solve the immediate problem, but it will continuously break again with every minute change in operating processes, applications or infrastructure. Partly, it’s the ease with which an RPA bot can be deployed that’s to blame. Why go to the trouble of creating APIs for critical applications when it is easier to just have a bot screen-scrape, say, an accounts receivable app to get the one extra data field needed for the new invoices?

The answer, of course, is that this problem is just a symptom of a larger issue within the IT infrastructure: RPA does not fix a spaghetti tangle of applications, data and integration strategies. Organizations with this problem need to focus on building a cloud-native foundation first. Otherwise, if the team continues to throw bots at every new business request, the entire data center will eventually collapse under unmanageable complexity.

Automating Human Work

RPA is a valuable technology when it is used to automate human work, and not to patch holes in IT systems and applications. It is made more powerful when it can integrate effectively into a modern application environment—monitoring events, using cloud services to gather information and interacting with applications via APIs.

The opportunities for automation are huge, but the supporting IT infrastructure is critical. I believe those enterprises that are able to combine a modern cloud-native application environment with open source, intelligent, cloud-native bots will have an unparalleled competitive advantage.

Protecting your APIs if account takeover (ATO) or DDoS keeps you up at night

Red Hat partner Imperva recently announced an expanded portfolio of security products targeting protection against account takeover (ATO) and attacks targeted at APIs, as well as a three-second SLA for DDoS attack protection.

These new services provide comprehensive protection against a wide range of cybersecurity threats targeting not just websites, but also business-critical APIs, legacy systems, and other applications.

Read Imperva’s blog, where they highlight that while APIs play a critical role in accelerating innovation and business growth in the digital economy, they need to be protected from cybercriminals and attacks such as intrusion attempts.

Also, don’t miss their press release where our director of product management, Mark Cheshire, weighs in on how Imperva’s solutions complement Red Hat 3scale API Management for a better, safer experience.

Red Hat Integration Customers Showcase Their Accomplishments at 2019 Red Hat Summit

We just wrapped up Red Hat Summit and it was a great time! Thank you to those of you who were able to attend, and we hope to see those who couldn’t in San Francisco next year!

Customers using Red Hat Integration capabilities were recognized and showcased their relationships with Red Hat. BP and Emirates NBD were honored as Red Hat Innovation Award winners for their technological achievements, which demonstrated creative thinking, determined problem-solving and transformative uses of Red Hat technology. Ally Financial and Banco Galicia received honorable mentions for the Innovation Awards as well. In addition to the Innovation Award winners and honorable mentions, multiple Red Hat Integration customers spoke about their respective paths to success. These customers included the Government of Canada, Canadian Imperial Bank of Commerce (CIBC), Spark New Zealand, Deutsche Telekom IT and the Society for Worldwide Interbank Financial Telecommunication (SWIFT).

BP was selected as an Innovation Award winner for modernizing their complex technology infrastructure by building a self-service platform and a DevOps culture. BP utilized Red Hat OpenShift Container Platform, and other Red Hat technologies, to build their Application Engineering Services’ Digital Conveyor. The Digital Conveyor empowers product delivery teams with self-service capabilities, a DevOps approach and a continuous integration/continuous delivery pipeline.

Emirates NBD was named an Innovation Award winner for its implementation of a cloud-native platform and agile methods to offer customers personalized, always-on digital banking. Emirates NBD established their own scalable private cloud platform, Sahab (White Cloud), with the flexibility to accommodate a future hybrid cloud model. Emirates NBD built Sahab on Red Hat OpenShift Container Platform and employs other Red Hat technologies like Red Hat 3scale API Management, Red Hat Fuse and Red Hat AMQ. With the new platform, Emirates NBD is able to manage over 500 APIs and is able to more easily integrate and collaborate with third parties including FinTechs, government institutions and technology partners.

Ally Financial was recognized as an Innovation Award honorable mention for creating a containerized hybrid cloud platform to support cloud-native application development and adopting DevOps practices to increase collaboration, innovation and efficiency. Ally Financial needed to be able to develop and deploy releases faster and with greater reliability to satisfy their customers’ needs for speed and efficiency. Ally Financial used a range of Red Hat technologies and services including Red Hat Fuse, Red Hat OpenShift Container Platform, Red Hat OpenShift Application Services and Red Hat Consulting to develop the new platform and adopt practices to improve time-to-market, increase agility and strengthen its competitive advantage.

Banco Galicia was chosen as an Innovation Award honorable mention for streamlining its digital services by migrating its channels and back-end systems to a unified, cloud-native, omnichannel platform. This platform supports collaborative agile development, provides cross-environment workloads and offers more secure integration with existing banking systems. Banco Galicia has taken advantage of Red Hat Integration products including Red Hat 3scale API Management, Red Hat Fuse, Red Hat AMQ and Red Hat Data Grid to simplify digital services. This simplification has improved the customer experience, decreased application downtime, and is estimated to have decreased development costs.

The Government of Canada described how it overcame API challenges by building a central API hub where government departments can publish and monitor their APIs, using Red Hat 3scale API Management and Red Hat OpenShift Container Platform.

CIBC described how they built an optimal architecture for open banking using service mesh and API gateways. They also laid out how to balance creativity and standardization while building a culture focused on speed and innovation.

Spark New Zealand discussed how they navigated a company-wide journey of agile transformation and migrating from a legacy, proprietary integration platform to a container-based, open source, agile integration solution utilizing Red Hat AMQ, Red Hat Fuse and Red Hat OpenShift Container Platform.

Deutsche Telekom IT spoke about harnessing agile integration to empower internal development teams to deliver services faster in a consistent and standardized way by building a next-generation integration platform using Red Hat Fuse and Red Hat OpenShift Container Platform.

SWIFT chronicled how they integrated customers’ back-office functions with SWIFT functions to help ease customer pain points. SWIFT chose Red Hat Fuse, Red Hat AMQ and Red Hat OpenShift Container Platform to be embedded in the main messaging client application that hundreds of financial institutions use to exchange millions of financial messages each day across the SWIFT network.

This is just a small sample of the successes customers across industries are having with Red Hat Integration. Agile integration and the application environment are having profound impacts on institutions around the world. We at Red Hat are standing by to help you take the next (or first) step in your digital transformation.

An Application Environment Powered by Integration

We have been doing application and data integration for 20+ years. Why? Siloed applications and data need to be connected to synchronize data between applications, complete and automate business processes, and improve efficiencies. Technologies like the enterprise service bus (ESB), managed file transfer (MFT), message brokers, and integration platform as a service (iPaaS) have been used to solve integration requirements. Today, as organizations pursue competitive advantage and differentiation through software and adopt digital services, integration has become a strategic capability. Integration is pervasive and critical for digital business because it connects new applications, existing applications, data and devices to create innovative, differentiated solutions. Evolving customer engagement models, new customer experiences, and competitive service delivery models are all powered by integration.

Integration in the application environment

Key transformations in technology, software architecture and software delivery have elevated the role and importance of integration in application development, and have led to the rearchitecting of old monolithic integration solutions into a set of middleware services that developers can easily incorporate into applications. In the era of digital services, applications are distributed, spanning infrastructures and cloud environments. Application creators need to create quickly and iterate frequently, and they need tools and application infrastructure that enable this increased agility. The success (or failure) of a new service or application depends on its ability to communicate with other services, across any infrastructure, in scalable and secure ways. Application components must be composable, deployable across cloud and hybrid environments, and able to interoperate to deliver value. Applications must be able to connect to existing data, new SaaS apps, custom apps, IoT data and more. Integration is essential to realizing these architectures. This world is your application environment: the context of systems, services, applications, infrastructures and clouds where your software innovation happens. Integration has become a central part and key enabler of application creation, and a first-order consideration in building solutions.

Cloud-native integration

As application developers adopt microservices-style architectures and DevOps practices to create rapid, incremental innovations, container-based infrastructure is key to delivering these services. Cloud-native applications take full advantage of the container platform to create scalable, resilient and iterative services. An application environment requires a common set of standards, practices and capabilities that are “engineered together” to develop, test and deploy cloud-native applications. In a robust application environment, cloud-native integration capabilities are container-based, API-centric, distributable, and composable as microservices. Application developers can easily include the integration capabilities in their toolsets and practices. This allows application developers, integration developers, and citizen integrators to all participate in delivering adaptive and innovative services.
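
As one hedged illustration of such a composable integration capability (Apache Camel, the engine behind Red Hat Fuse; all endpoints below are hypothetical), a small route that bridges a legacy REST endpoint to Kafka can be a few lines, packaged and scaled as its own container:

```java
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.main.Main;

public class OrdersRoute extends RouteBuilder {

    @Override
    public void configure() {
        // Poll a legacy REST endpoint and publish the results to Kafka.
        // Requires the camel-http and camel-kafka components on the classpath.
        from("timer:pollOrders?period=5000")
            .to("http://legacy.example.internal/orders/pending")
            .log("Fetched pending orders: ${body}")
            .to("kafka:orders?brokers=kafka.example.internal:9092");
    }

    public static void main(String[] args) throws Exception {
        Main main = new Main();
        main.configure().addRoutesBuilder(new OrdersRoute());
        main.run(args);
    }
}
```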

Roughly two years ago we introduced the concept of agile integration, which is based on three key capabilities: distributed integration, APIs and containers. The agile integration technologies and architectural approach are foundational to realizing an application environment that meets the needs of modern software-driven organizations.

Learn More

What is an application environment? Check out this blog by Middleware VP and GM, Mike Piech.

Business Process Management Reimagined – New Services for Application Developers

You may be familiar with Business Process Management (BPM). It is a discipline in which people use various methods to discover, model, analyze, measure, improve, optimize, and automate business processes. Today BPM, or more specifically the technology that supports BPM, is widely used in organizations large and small to automate business operations. The story of BPM is long, with roots going back to the early 1990s, and it has constantly reinvented itself to meet the evolving needs of enterprises. Once focused on driving efficiencies into back-office functions, BPM platforms have evolved into essential tools for enterprises looking to digitally transform operations and to deliver a personalized customer experience that is integrated across points of interaction.

BPM in the Application Environment

The focus on digital transformation has led to the modern role of BPM solutions in application development, and to the rearchitecting of the old monolithic BPMS as a set of middleware services that developers can easily incorporate into applications. Now often referred to as a Digital Application Platform (DAP), the BPMS has become part of the application environment: a catalog of components that can be included in applications requiring process management, decision management or optimization capabilities. Now, for example, when building an application that needs to determine whether an insurance application complies with underwriting rules, a developer can quickly locate the corresponding decision service within their application environment and include it in their application. Conversely, it is the new DAP solutions that enable such application services to be quickly created from models provided by the business. Business-friendly tools support the creation of a range of model types, from decision models built with the graphical Decision Model & Notation (DMN) standard to models of entire business processes constructed in Business Process Model & Notation (BPMN). DAP technology today is truly making it possible for business users to contribute to application development alongside developers.
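
As a sketch of what “including a decision service” can look like for that developer, here is a request shaped like the KIE Server DMN REST API; the server URL, container ID, credentials, model coordinates and inputs are all hypothetical placeholders:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class UnderwritingCheck {
    public static void main(String[] args) throws Exception {
        // DMN evaluation payload: which model to run and its input context.
        String body = """
            {
              "model-namespace": "https://example.org/dmn",
              "model-name": "Underwriting",
              "dmn-context": { "Applicant Age": 47, "Coverage Amount": 250000 }
            }
            """;
        String auth = Base64.getEncoder()
                .encodeToString("kieserver-user:password".getBytes());
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://kie.example.internal:8080/kie-server/services/rest/server/containers/underwriting_1_0/dmn"))
                .header("Content-Type", "application/json")
                .header("Authorization", "Basic " + auth)
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body()); // decision results come back as JSON
    }
}
```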

Cloud-Native Digital Automation

The advantages of modern digital automation middleware are not limited to application development, however. Once built, those new applications must be deployed on a variety of cloud platforms, they must scale automatically to meet varying demands, they must be secure, and they must be easy to replace or upgrade without impacting the user experience. At Red Hat, the application environment in which DAP services are included is designed from the ground up to be cloud-native. DAP services are deployed in containers and managed by Kubernetes to provide the scalability and resiliency that enterprises need.

Digital automation today is an essential part of modern applications, and its importance is only likely to increase as we look to the future role of the Digital Application Platform as the logical home for emerging technologies like Robotic Process Automation (RPA) and artificial intelligence/machine learning. At Red Hat, we are focused on growing our application environment to support this widening technology landscape, so that our customers can succeed in an ever more digital world.

Red Hat JBoss Enterprise Application Platform 7.2 Availability

Red Hat JBoss Enterprise Application Platform 7.2 (JBoss EAP), Red Hat’s flagship middleware offering for enterprise Java, is now available. Organizations around the globe trust and rely on JBoss EAP, a Java EE-compliant application server, to run their production workloads in on-premise, virtualized, containerized, and private, public, and hybrid cloud environments. With this release, Red Hat reaffirms its continued commitment to Java EE 8 as well as to Jakarta EE, the new home for cloud-native Java and a community-driven specification under the Eclipse Foundation.

Eclipse GlassFish 5.1 RC1 Release

Today the first release candidate of the Eclipse GlassFish server targeting the new Jakarta EE 8 release is available. Moving the GlassFish source repositories over to the Eclipse GitHub organization was a huge effort, and it was accompanied by the move of the TCK project as well. You can read about the details in Dmitry Kornilov’s blog here.

Red Hat’s Support of Jakarta EE

Red Hat is committed to supporting the evolution of enterprise Java at Eclipse and has been focusing on the development of the Eclipse Jakarta EE specification process, as well as helping to get the migrated projects and TCK projects running under the Eclipse CI infrastructure.

The new specification process is a replacement for the Java Community Process (JCP), which was used to develop the Java EE specifications through Java EE 8. It provides a fully open source process that includes specifications, APIs and TCKs. The Eclipse Jakarta EE specification process will be used to develop the next generation of the EE4J specifications. Mike Milinkovich has written about the current process draft status in detail here. The initial draft is in public review, so I recommend you take the opportunity to browse through it and comment on the draft document provided in Mike’s blog post.

Red Hat at VoxxedDays Microservices

Microservices are no longer a playground: developers and architects have adopted them and re-architected applications to reap their benefits, and many have deployed them to production. Voxxed Days Microservices is an exciting place to share those experiences and to hear what Red Hat has been doing in this space with MicroProfile, Thorntail, SmallRye and more. As you might know, Voxxed Days Microservices 2018 will take place in Paris on the 29th of October, and Red Hat will be there together with the Eclipse Foundation. We have an exciting lineup of sessions and a presence at the booth for conversations, questions and discussions on the topic.

Here is a list of sessions:

  • Monday 09:15 CET – Keynote: Distant past of Microservices
  • Monday 14:30 CET – Data in a Microservices world: from conundrum to options
  • Monday 18:15 CET – Ask the Architects
  • Tuesday 11:15 CET – Thorntail: A Micro Implementation of Eclipse MicroProfile
  • Tuesday 13:00 CET – What is SmallRye and how can you help?
  • Tuesday 14:30 CET – Data Streaming for Microservices using Debezium

There will also be a full-day hands-on workshop on Wednesday, October 31st, for those interested in learning more about MicroProfile.

Come by the Eclipse MicroProfile booth and pick up some swag!

For more information:

  • Voxxed Days Microservices conference program at-a-glance
  • The path to cloud-native applications: 8 steps
  • Red Hat OpenShift Application Runtimes
  • Understanding cloud-native apps
  • Understanding middleware
  • Thorntail.io
  • MicroProfile.io