Introducing Technology Reports

RHAMT 4.1.0 Is Released!

The 4.1.0 version of the Red Hat Application Migration Toolkit has recently been released, and with it is a new report that I’d like to highlight.

If you’re not familiar with RHAMT, check out my previous article introducing the product.

Technology Reports

One of the RHAMT features mentioned at this year’s summit was the new Technology Report.

This report provides an aggregate listing of the technologies used, grouped by function, for the analyzed applications. It shows how the technologies are distributed, and hundreds of applications can be simultaneously compared after analysis has been performed. In addition, the size, number of libraries, and story point totals of each application are displayed, allowing you to quickly determine each application’s type from a single report.

Examining application_13 in the above list, we can see that this is likely a secured frontend application with a cache for performance. It contains several libraries, most of which pertain to security in some form.

Each application can be further examined to identify the technologies in use. For instance, drilling down into application_13 shows:

Here we can see the precise libraries in use within each category. As previously noted, this application uses a large number of security libraries, and we can now identify exactly which ones.

Regardless of how you use the technology reports, I’m certain they will be useful in your migration and modernization efforts.

Effective Case Management within a BPM Framework

In real life, organizations have workflows which may not fit into a prescribed, sequential process path, or which require human intervention or approval before the entire process can be completed. Within the business process world, more unstructured and unpredictable work is handled through case management rather than process management.

There are slightly different standards defined for case management and process management, which reflect the differences in the types of process flows and data being handled in each type of model.

But the question for business architects is which standard to use or whether to try to balance both — and then for developers to try to create models on different or shared development platforms.

A Quick Comparison of CMMN and BPMN for Development Standards

First, it may be helpful to explain why there is a difference between business process management and case management. The two models are defined by separate specifications, Business Process Model and Notation (BPMN) and Case Management Model and Notation (CMMN), respectively.

Continue reading “Effective Case Management within a BPM Framework”

Process management and business logic for responsive cloud-native applications: Red Hat Process Automation Manager is released

Today, Red Hat announced the latest major release of its business process suite, with a new name and several major changes that pivot the focus of the product itself. Red Hat Process Automation Manager is about more than providing a business process modeler or optimizing resource allocation. This is the first generation (at Red Hat) of a digital automation platform — a hub where business users and technical developers can collaborate to create strategically relevant, intelligent applications.

Red Hat Process Automation Manager has two core conceptual areas:

  • The first is based on decision management (the “intelligent” part of intelligent or event-driven applications). This includes the decision engine of Red Hat Decision Manager and allows automated, immediate responses to interactions, from event processing to resource optimization (a minimal code sketch follows this list).
  • Second, Process Automation Manager provides the means of modeling and applying business logic within an application. In combination with a graphical UI, this creates a platform where business users can design business logic in collaboration with the technical teams.
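To make the decision management side more concrete, here is a minimal sketch in Java using the public KIE API that sits underneath Red Hat Decision Manager. It assumes a project whose rules are packaged on the classpath (via a kmodule.xml), and the LoanApplication fact type is purely hypothetical:

import org.kie.api.KieServices;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.KieSession;

public class DecisionExample {
    public static void main(String[] args) {
        // Load the rules packaged in this project's classpath container
        KieServices ks = KieServices.Factory.get();
        KieContainer container = ks.getKieClasspathContainer();
        KieSession session = container.newKieSession();
        try {
            // Insert a fact and let the decision engine respond immediately
            session.insert(new LoanApplication("ACME", 25_000));
            session.fireAllRules();
        } finally {
            session.dispose();
        }
    }

    // Hypothetical fact type, used only for illustration
    public static class LoanApplication {
        public final String applicant;
        public final int amount;

        public LoanApplication(String applicant, int amount) {
            this.applicant = applicant;
            this.amount = amount;
        }
    }
}

Any rules that match LoanApplication facts fire as soon as fireAllRules() is called, which is the kind of automated, immediate response to interactions described above.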

New feature: Process management + case management

The heart of a BPM platform is the “BP” — business process modeling. The previous BPM Suite supported BPMN, the notation specification for business process models, and DMN, the notation specification for decision models. The assumption behind a lot of these specs is that the workflows or processes being modeled are relatively static or sequential. For certain types of business processes, that is an accurate assumption (things like resource optimization or scheduling). However, in many organizations, there are also processes which are not linear, which may follow different steps in a dynamic sequence, or which may be interrupted or require human intervention at certain points. These are generally defined within a related notation specification, Case Management Model and Notation (CMMN).

While there are differences, there is also a lot of conceptual overlap between business processes / BPMN and case management processes / CMMN. Process Automation Manager combines the functionality of both process models and case management models within a single digital automation platform. (This is covered in more detail in the blog post here.)

Supporting both linear process / task models and dynamic or unpredictable case management models within the same platform allows developers to have a simpler development process (and, combined with other features like Process Automation Manager’s new graphical UI, makes collaboration with business users easier).
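As a sketch of what this combination looks like to a developer, the snippet below uses the KIE Server Java client API to start both a structured BPMN process and a dynamic case against the same server and container. The server URL, credentials, container ID, and definition IDs are placeholders, and the exact client methods may vary slightly between releases:

import java.util.Collections;

import org.kie.server.api.marshalling.MarshallingFormat;
import org.kie.server.client.CaseServicesClient;
import org.kie.server.client.KieServicesClient;
import org.kie.server.client.KieServicesConfiguration;
import org.kie.server.client.KieServicesFactory;
import org.kie.server.client.ProcessServicesClient;

public class ProcessAndCaseExample {
    public static void main(String[] args) {
        // Connect to a KIE Server instance (URL and credentials are placeholders)
        KieServicesConfiguration config = KieServicesFactory.newRestConfiguration(
                "http://localhost:8080/kie-server/services/rest/server", "user", "password");
        config.setMarshallingFormat(MarshallingFormat.JSON);
        KieServicesClient client = KieServicesFactory.newKieServicesClient(config);

        // Start a structured BPMN process...
        ProcessServicesClient processes = client.getServicesClient(ProcessServicesClient.class);
        Long processInstanceId = processes.startProcess(
                "my-container", "orders.order-fulfillment",
                Collections.<String, Object>singletonMap("orderId", "ORD-42"));

        // ...and a dynamic case, from the same client and the same container
        CaseServicesClient cases = client.getServicesClient(CaseServicesClient.class);
        String caseId = cases.startCase("my-container", "insurance.claim-case");

        System.out.println("Started process " + processInstanceId + " and case " + caseId);
    }
}

Both calls go to the same KIE Server, which is the practical payoff of having process models and case models live in one platform.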

Process Automation Manager also supports other ways of modeling and visualizing data and workflows:

  • Data modeling
  • Decision modeling
  • Custom data dashboards
  • Process simulations

New Feature: An easier way for business users to collaborate (graphical UI)

Previous versions of Red Hat JBoss BPM Suite were designed around business process logic, but were intended to be used by Java developers within the application development process. Beginning with this Process Automation Manager 7.0 release, there is a new Entando UI included with the platform. This provides an easier, graphical interface where business users can just drag and drop elements into their models — ultimately using the same platform that the developers use to create the application. Business processes, rules, and logic can be added to the application essentially without writing a single line of code.

This also effectively changes the workflow for creating event- and process-driven applications. Previously, developers did all the work within their development environment. Now, business users can work in parallel (using the Process Automation Manager UI) to create artifacts which can be pulled into the developer’s IDE and code. Everything can then be packaged up and deployed in containers or other environments.

New feature: Cloud (and container) native applications

With more distributed, hybrid infrastructures, it is imperative that applications be able to function exactly the same regardless of the underlying platform. And those applications need to be designed, natively, to work in a distributed, dynamic environment so that they can be rapidly deployed, updated, or scaled.

Process Automation Manager can itself run in Red Hat OpenShift containers, in public or private clouds, on-premise, or in all environments — depending on the needs of your development and infrastructure teams. Additionally, the models and applications created using Process Automation Manager as a platform can be deployed into cloud instances, OpenShift containers, or local instances. This allows truly hybrid development, testing, and production environments.

Process Automation Manager components, applications, and models can all be exposed and accessed using REST APIs, allowing integration with other software applications or management tools.
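As a quick illustration of that REST access, here is a hedged sketch in plain Java (11+) that lists the containers deployed to a KIE Server instance. The URL, path, and credentials are placeholders and assume the standard KIE Server REST root:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class KieServerRestExample {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint: the standard KIE Server REST root plus /containers
        String url = "http://localhost:8080/kie-server/services/rest/server/containers";
        String auth = Base64.getEncoder().encodeToString("user:password".getBytes());

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(url))
                .header("Authorization", "Basic " + auth)
                .header("Accept", "application/json")
                .GET()
                .build();

        // The response is a JSON listing of the deployed process and decision projects
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}

The same endpoints can just as easily be called from other management tools or from application code in any language.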

Additional Resources

  • Dive a little deeper into process automation technology with our tech overview.
  • For general information about the Process Automation Manager, check out the datasheet.
  • There are different use cases for process automation and a business decision engine. The FAQ runs through some things to consider.
  • Get started by actually using the Process Automation Manager. Red Hat Developers has a whole “hello world” example, waiting for you.

Meet application integration in the times of hybrid cloud

The concept of agile integration, depending on whom you ask, may appear as a contradiction in terms. Integration is a concept that used to be associated with “slow,” “monolithic,” “only to be touched by the expert team,” etc. Big and complex legacy enterprise service buses connected to your applications were the technology of choice at a time when agility was not a requirement, when the cloud was barely an idea, when containers were associated with maritime shipping and not with application packaging and delivery.

Can the principles of agile development be combined with those of modern integration? Our response is yes, and we call it agile integration. Let me show you what it is, why it is important, and what we at Red Hat are doing about it.

Software development methodologies have evolved rapidly in the last few years to incorporate innovative concepts that result in faster development cycles, agility to react to changes, and immediate business value. Development now takes place in small teams, changes can be approved and incorporated quickly to keep pace with the changing demands of the business, and each iteration of the code has a product as the ultimate result. No more need for long development cycles and never-ending approvals for changes. And importantly, business and technical users join forces and collaborate to optimize the end result.

In addition, modern integration requires agility, cloud readiness, and support for modern integration approaches. In contrast with legacy, monolithic ESBs, modern integration is lightweight, pattern-based, scalable, and able to manage complex, distributed environments. It has to be cloud-ready and support modern architectures and deployment models like containers. It also has to provide integration services with new, popular technologies, like API management, which is becoming the preferred way to integrate applications and is at the core of microservices architectures. And it must support innovative, fast-evolving use cases such as the Internet of Things (IoT).
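To give a feel for what lightweight, pattern-based integration looks like in code, here is a minimal sketch of a classic content-based router written with Apache Camel, the open source integration engine behind Red Hat Fuse. The directories and XPath expression are illustrative only:

import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.main.Main;

public class OrderRouter {
    public static void main(String[] args) throws Exception {
        Main main = new Main();
        main.addRouteBuilder(new RouteBuilder() {
            @Override
            public void configure() {
                // Content-based router: pick up order files and route them
                // by priority, with no heavyweight central ESB involved
                from("file:orders/incoming")
                    .choice()
                        .when(xpath("/order/@priority = 'high'"))
                            .to("file:orders/priority")
                        .otherwise()
                            .to("file:orders/standard");
            }
        });
        main.run();
    }
}

A small, self-contained route like this can be packaged, scaled, and deployed independently, which is exactly the agility that monolithic ESBs struggle to provide.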

Continue reading “Meet application integration in the times of hybrid cloud”

Red Hat present at EclipseCon France 2018

EclipseCon France is taking place this week in Toulouse, France (June 13-14, 2018) and it’s offering a great lineup of top-notch sessions on nine different tracks, from IoT to cloud and modeling technologies. This year, there is even a dedicated track for “Microservices, MicroProfile, EE4J and Jakarta EE,” which is covering topics such as Istio, 12-factor apps, geoscience, machine learning, noSQL database integration, cloud-native application development, security, resilience, scalability, and the latest statuses of the Jakarta EE and MicroProfile open source specification projects. Under this track, we are hosting two sessions:

But we are also delivering other interesting sessions under the “Reactive Programming” track:

Under the “IoT” track:

Under the “Eclipse IDE and RCP in Practice” track:

And, under the “Cloud & DevOps” and “Other Cool Stuff” tracks:

For those of you who will be at the conference, we invite you to attend the sessions above and to stop by the Red Hat booth to learn how Red Hat can help your organization solve your IT challenges (and get your swag too!). And for those of you who would like to learn more about Red Hat offerings in relation to the topics above, please visit the following links:

Digital Automation Platforms: Injecting speed into application development

Red Hat has just published a new study by Carl Lehmann of the 451 Group, “Intelligent Process Automation and the Emergence of Digital Automation Platforms,” that examines the increasing importance of business automation technologies in modern business, and the ways that converged solutions (digital automation platforms) are bringing value to organizations engaged in digital transformation projects.

Carl writes that competitive advantage is enabled when an organization either does the same things as its rivals, but differently, or does different things that customers acknowledge as superior. In today’s competitive markets, businesses are turning to next-generation digital automation platforms (DAP) to enable greater automation of key business functions and greater flexibility in responding to their customers’ needs.

A DAP is a set of tools and resources structured within a uniform framework to enable developers to rapidly design, prototype, develop, deploy, manage, and monitor process-oriented applications – from simple task-related workflows to dynamic unstructured collaborative activity streams and even highly structured cross-functional enterprise applications. To do so, DAPs are equipped with a range of new capabilities that go beyond those of their BPM and application development predecessors.

Continue reading “Digital Automation Platforms: Injecting speed into application development”

Announcing: Red Hat Fuse 7 is now available

After several technical previews over the last few months, Red Hat Fuse 7 is officially available. This is a significant release, both for Fuse itself and for integration platforms, because it represents a shift from a more traditional, on-premise, centralized integration architecture to a distributed, hybrid-environment integration architecture.

Integration itself has historically been a bottleneck for infrastructure design and changes. The integration points were largely centralized and controlled by a central team in an attempt to manage dependencies and standardize data management between applications. However, that centralization also made change difficult, and it was governed more by procedure and bureaucracy than business innovation. As with traditional infrastructure architecture more generally, integration has not historically been an agile or adaptive architecture.

Red Hat Fuse (and related community projects) is the beginning of a departure from traditional, rigid integration platforms to more agile, distributed integration design. Fuse introduces three major features in the latest release:

  • Fuse Online, a fully hosted environment for Fuse applications and integrations. Fuse Online provides immediate access to the functionality of Fuse, without having to install and configure it on-premise. Developers can begin testing and customizing integrations immediately. Connectors can be uploaded to the online development area to allow even more integrations.
  • Fuse container images for Red Hat OpenShift. Fuse runs natively on OpenShift, allowing local, containerized integration points to be created in development teams and to be designed, tested, and updated within DevOps workflows as part of the overall application development cycle.
  • A drag-and-drop UI for integration pattern design. While integration development is typically done within IT teams, integration design relies on business knowledge. Business managers and analysts need to be able to collaborate effectively with their development teams. The new Fuse Ignite UI (based on the Syndesis.io project) is a low-code way to develop integrations — business users can use design elements to create integration architectures and to work with their development teams, within the same tool set.

These three features allow more agile integration development. Fuse installations can span online, on-premise, or container-based environments without reducing functionality. The result is an integration platform that crosses environments and can be as lightweight and decentralized as an individual development team requires, or scale to an enterprise-wide platform. The low-code UI allows business users to be brought directly into the application development cycle, enabling business logic to be incorporated into the integration application design from the beginning.
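Fuse integrations are built on Apache Camel routes (Camel is among the updated components in this release), and Spring Boot is one of the supported deployment options. A minimal sketch of a containerizable Fuse-style integration point might look like the following; it assumes the camel-spring-boot-starter (or the equivalent Fuse BOM dependency) is on the classpath, and the class, route, and endpoint names are hypothetical:

import org.apache.camel.builder.RouteBuilder;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.stereotype.Component;

@SpringBootApplication
public class FuseIntegrationApplication {
    public static void main(String[] args) {
        SpringApplication.run(FuseIntegrationApplication.class, args);
    }

    @Component
    static class HeartbeatRoute extends RouteBuilder {
        @Override
        public void configure() {
            // Camel picks up RouteBuilder beans automatically when the
            // Spring Boot starter is present; this route just logs a heartbeat
            from("timer:heartbeat?period=5000")
                .setBody(constant("integration point is alive"))
                .log("${body}");
        }
    }
}

Packaged as a container image, a route like this becomes one of the small, independently deployable integration points described above.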

Additionally, Fuse 7 contains these new features:

  • Support for Spring Boot deployment for Fuse applications
  • 50 new application connectors (with a total of over 200 included connectors)
  • A new monitoring subsystem
  • Updated component versions, including new versions of Red Hat JBoss Enterprise Application Platform and Apache Camel
  • A new name (Red Hat Fuse, rather than Red Hat JBoss Fuse)



An Introduction to Red Hat Application Migration Toolkit

Application migration and modernization can be a daunting task. Not only do you have to update legacy applications with newer libraries and APIs, but you often must also address new frameworks, infrastructures, and architectures, all while simultaneously keeping resources dedicated to new features and versions.

Red Hat Application Migration Toolkit (RHAMT), formerly known as Windup, provides a set of utilities for easing this process. Applications can be analyzed through a command-line interface (CLI), a web-based interface, or directly inside Eclipse, allowing immediate modification of the source code.

These utilities allow you to quickly gain insights into thousands of your applications simultaneously. They identify migration challenges and code or dependencies shared between applications, and they accelerate the code changes needed to run your applications on the latest middleware platforms.

Choosing the Right Distribution

You’ve read the introduction, possibly seen a video, and are eager to run your first application through the process. Where do you begin?

RHAMT provides a number of different distributions to meet your needs, and all include detailed reports that highlight migration issues with effort estimation. Each of these is summarized below.

CLI

CLI Download | Product Documentation

The CLI is a command-line tool that provides access to the reports without the overhead of the other tools. It includes a wide array of customization options, and allows you to finely tune the RHAMT analysis options or integrate with external automation tools.

Web Console

Web Console Download | Product Documentation

The web console is a web-based system that allows a team of users to assess and prioritize migration and modernization efforts. In addition, applications can be grouped into projects for analysis.

Eclipse Plug-in

Eclipse Plug-in Download | Product Documentation

The Eclipse plug-in provides assistance directly in Eclipse and Red Hat JBoss Developer Studio (JBDS) and allows developers to see migration issues directly in the source code. The Eclipse plug-in also provides guidance on resolving issues and offers automatic code replacement where possible.

Start by Choosing a Distribution

  • If you’re working on a team that needs concurrent access to the reports, or have a large number of applications to analyze, then choose the web console.
  • If you’re a developer familiar with Eclipse or JBDS and want live feedback, then start with the Eclipse plug-in.
  • Otherwise, we recommend starting with the CLI.

Follow the download link for the chosen distribution, and then examine the first few chapters in the appropriate guide to install and run the tool.

Analyzing an Application

You have a local installation of RHAMT, located at RHAMT_HOME, and an application you want to analyze. For the purposes of this blog, we’ll assume that you chose the CLI. With that out of the way, let’s get started.

The analysis is performed by calling `rhamt-cli` and passing it the application along with any desired options, as seen in the following example.

$ bin/rhamt-cli --sourceMode --input /path/to/source_folder/ --output /path/to/output_folder/ --target eap7

The options are straightforward:

  • --sourceMode – indicates the input files are source files instead of compiled binaries
  • --input – path to the file or directory containing the files to be analyzed
  • --output – path to the directory that will contain the reports
  • --target – technology to migrate to; used to determine the rules for the analysis

Once the analysis finishes, a message appears in the console indicating the path to the report.

Report created: /path/to/output_folder/index.html
Access it at this URL: file:///path/to/output_folder/index.html

Rules

All of RHAMT’s distributions utilize the same rules engine to analyze the APIs, technologies, and architectures used by the application you plan to migrate. This engine extracts files from archives, decompiles classes, scans and classifies file types, analyzes XML and other file content, analyzes application code, and then generates the reports.

Each of these actions is handled by defined rules, which consist of a set of actions to perform once conditions are met. We’ll look more in-depth at how rules work, and creating your own custom rules, in a subsequent post, but for now know that RHAMT includes a comprehensive set of standard migration rules to get you started.

Just Lifting and Shifting?

Lifting and shifting, or rehosting, an application is one possible first step in migrating it. This process involves moving the application onto a different target runtime or infrastructure. A common end goal of this stage is to make the smallest number of changes to have the application running successfully in a cloud environment.

Once the application is successfully running in the cloud, the next step is to modernize the application so that it’s natively designed for a cloud environment. Instead of simply rehosting the application, this step involves redesigning it, moving unnecessary dependencies and libraries outside the application.

Whichever step you’re at, RHAMT assists with both by providing a set of cloud-ready rules. Once these rules are executed against the application, a detailed report is created that indicates what changes should be made. For anyone familiar with using RHAMT to migrate middleware platforms, the process is similar – examine the report and adjust your application based on the feedback.

It’s that simple.

Summary

Wherever you are in the migration process, I’d recommend looking at RHAMT. It’s extremely simple to set up, and it comes with a number of default rules to assist in any part of the migration and modernization process. In addition, RHAMT facilitates solving unique problems once – after a given solution has been identified, a custom rule can be created to capture that solution, vastly simplifying the migration process.

Stay tuned for our next update, where we discuss how to create custom rules to better utilize RHAMT in your environment.

References

https://developers.redhat.com/products/rhamt/overview/

https://access.redhat.com/documentation/en-us/red_hat_application_migration_toolkit/

Announcing AMQ Streams: Apache Kafka on OpenShift

Cross-posted from the Developers Blog. See the session at Red Hat Summit on Apache Kafka and AMQ data streams on Thursday, May 10, at 11:15.

We are excited to announce a Developer Preview of Red Hat AMQ Streams, a new addition to Red Hat AMQ, focused on running Apache Kafka on OpenShift.

Apache Kafka is a leading real-time, distributed messaging platform for building data pipelines and streaming applications.

Using Kafka, applications can:

  • Publish and subscribe to streams of records.
  • Store streams of records.
  • Process records as they occur.

Kafka makes all of this possible while being fast, horizontally scalable, and fault tolerant. This makes Kafka suitable for a large range of use cases, including website activity tracking, metrics and log aggregation, stream processing, event sourcing, and IoT telemetry. The forthcoming AMQ Streams product will provide Red Hat customers with a supported offering for running Apache Kafka on Red Hat Enterprise Linux and on Red Hat OpenShift Container Platform.
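As a minimal illustration of the publish side, the following sketch uses the standard Apache Kafka Java producer API; the broker address, topic name, and payload are placeholders:

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class TelemetryProducer {
    public static void main(String[] args) {
        // Placeholder broker address for a local or remote Kafka cluster
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Publish a record to a stream; consumers can process it as it occurs
            producer.send(new ProducerRecord<>("device-telemetry", "device-17", "{\"temp\": 21.5}"));
        }
    }
}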

As more and more applications move to Kubernetes and OpenShift, it is increasingly important to be able to run the communication infrastructure on the same platform. OpenShift, as a highly scalable platform, is a natural fit for messaging technologies such as Kafka. With AMQ Streams, however, our goal is not just to run Apache Kafka on OpenShift, but to make running and managing Apache Kafka “OpenShift native.”

Combining the massive scalability of Kafka with an elastic platform like OpenShift involves resolving a number of technology challenges:

  • Kafka brokers are inherently stateful, because each has its own identity and data logs that must be preserved in the event of restarts.
  • Updating and scaling a Kafka cluster requires careful orchestration to ensure that messaging clients are unaffected and no records are lost.
  • By design, Kafka clients connect to all the brokers in the cluster. This is part of what gives Kafka its horizontal scaling and high availability, but when running on OpenShift, this means the Kafka cluster cannot simply be put behind a load-balanced service like other services. Instead, services have to be orchestrated in parallel with cluster scaling.
  • Running Kafka also requires running a Zookeeper cluster, which has many of the same challenges as running the Kafka cluster.

AMQ Streams simplifies the deployment, configuration, management, and use of Apache Kafka on OpenShift using the Operator concept, thereby enabling the inherent benefits of OpenShift, such as elastic scaling. An Operator is an application-specific controller that extends the Kubernetes APIs and combines them with domain-specific knowledge, making it easy to run and manage complex applications. Developers and administrators used to OpenShift’s declarative approach to resource provisioning can now enjoy those same benefits when working with Kafka, Kafka Connect, and Kafka topics.

AMQ Streams makes it easy to:

  • Deploy a complete Kafka cluster, at the scale that suits you, with the click of a button or with a single oc create command.
  • Deploy the Kafka topic right alongside the microservice that uses it.
  • Scale up the partitions of that topic.
  • Trivially scale the Kafka cluster up and down according to load.
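Once a cluster and topic are deployed this way, an application in the same OpenShift project can consume from them with the standard Kafka consumer API. The sketch below assumes the bootstrap service follows the <cluster-name>-kafka-bootstrap naming convention used by the underlying Strimzi project; the topic and group names are placeholders:

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class TelemetryConsumer {
    public static void main(String[] args) {
        // Connect to the Kafka bootstrap service created by the Operator
        Properties props = new Properties();
        props.put("bootstrap.servers", "my-cluster-kafka-bootstrap:9092");
        props.put("group.id", "telemetry-processor");
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("device-telemetry"));
            while (true) {
                // Poll for new records and process them as they arrive
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("%s -> %s%n", record.key(), record.value());
                }
            }
        }
    }
}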


AMQ Streams is optimized for running on OpenShift (as opposed to regular Kubernetes). Not only does it benefit from Red Hat’s years of experience and in-depth knowledge gained from developing and running OpenShift, but there is, for example, special support for building Kafka Connect clusters with the user’s own Kafka Connect plugins. Further, OpenShift-specific features and Red Hat product integrations are anticipated, with the overall aim being a seamless experience with full support from the OpenShift fabric up.

Unsurprisingly, because Red Hat is the world’s leading provider of open source technologies for the enterprise, AMQ Streams is fully open source and based on the Strimzi project.

The Developer Preview, which is being made available to interested customers this week, provides the foundation for running Kafka on OpenShift. Customers and other interested parties are invited to try it out, give us their feedback, and, if desired, collaborate on the open source Strimzi project to help shape the future direction of AMQ Streams on OpenShift.

If you’re lucky enough to be in San Francisco this week for Red Hat Summit, then you can hear a lot more about AMQ Streams (and the broader Red Hat AMQ product) at the following sessions:

  • Running data-streaming applications with Kafka on OpenShift
    Tue May 8, 1:00 PM–3:00 PM, Moscone South 156
    Marius Bogoevici, Paolo Patierno, Gunnar Morling [L1099]
  • Red Hat AMQ overview and roadmap
    Wed May 9, 11:45 AM–12:30 PM, Moscone West 2011
    David Ingham, Jack Britton [S2802]
  • Introducing AMQ Streams—data streaming with Apache Kafka
    Thu May 10, 11:15 AM–12:00 PM, Moscone West 2014
    Paolo Patierno, David Ingham [S1775]
  • Red Hat AMQ Online—Messaging-as-a-Service
    Thu May 10, 1:45 PM–3:45 PM, Moscone South 214
    Ulf Lilleengen, Paolo Patierno [W1098]

We expect to release further previews as we iterate towards the general availability release, which is planned for later this year.

Please give it a try and let us know what you think.


Why are our Application Platform Partners succeeding in Digital Transformation?

Last year we set out to start the Application Platform Partner Initiative with the objective of enabling deeper collaboration with partners focused on application platform and emerging technologies. We planned to create a collaborative go-to-market strategy between Red Hat and participating partner organizations focused on optimizing the value chain for application development and integration projects.

The Application Platform Partner Initiative focuses on Application Development-related and other emerging technology offerings, whose revenue increased 42% in our last fiscal year, to $624 million. Partners like the APPs are contributing to this growth, and we are happy to see the momentum continuing and their trust in Red Hat as a strategic partner. What started out as a pilot has developed into a fully fledged initiative with 28 partners across North America, who are as committed as we are to the role open source plays at the core of digital transformation.

As part of the success of this initiative, for the first time this year we have created the Application Platform Partner Pavilion at Red Hat Summit. Arctiq, Crossvale, Kovarus, Levvel, Li9, Lighthouse, OSI, Shadow-Soft, VeriStor, and Vizuri will join us this year in the pavilion. Don’t miss the chance to get to know the advanced solutions they have created on top of OpenShift and Red Hat Middleware products, which they will be showcasing at Red Hat Summit. Check out, for example, Arctiq Value Stream Mapping (VSM), Crossvale CloudBalancer for Red Hat® OpenShift, or Vizuri log aggregation solutions.

These partners are making a strong investment in enablement and a commitment to their go-to-market alliance with Red Hat, including co-marketing and sales collaboration. As examples of planned activities, Arctiq is running a Modern Mobile App Development event and Crossvale is running an OpenShift roadshow.

Levvel has been an active participant in the APP program, doing joint webinars, customer workshops, and panel discussions to promote Red Hat emerging technologies. As a result, they have influenced and closed quite a few customer deals and have a long list of potential opportunities. Don’t forget to attend their upcoming event “App Transformation Workshop: Monoliths to Microservices”!

Shadow-Soft has been particularly focused on growing the customer base for our OpenShift and JBoss product families, with innovative sales and marketing strategies that are turning into a growing pipeline of opportunities, and on running events around digital transformation.

VeriStor recently joined the APP program and is rapidly growing its practices around OpenShift and Red Hat Middleware, such as DevOps and Agile consulting, services, and software development.

OSI, an international company with long experience with JBoss, is also growing in the US and has built an Agile Integration demo environment focused on the JBoss Fuse integration platform to support its customer engagements, including integration with cloud and on-premise systems. Try to attend their “Monoliths to Microservices: App Transformation Workshop” right after Summit.

Vizuri has been a Red Hat partner for over 10 years. Having delivered more than 120 JBoss-related engagements, Vizuri has the JBoss experience and expertise to help customers reduce risk and improve time-to-value while avoiding project delays and unplanned downtime. You can’t miss their take on How To Manage Business Rules In A Microservices Architecture using OpenShift and JBoss BRMS.

Having recently joined the APP program, Astellent has heavily invested in enablement and marketing, while achieving exciting customer success. Read their views on the newly launched Red Hat Decision Manager 7.

Lighthouse has been helping businesses with the right mix of Red Hat’s public, on-premises, and hybrid cloud technologies, customizing them to fit their unique business needs. They have also been active with unique marketing events like the one with the Red Sox coming in May.

As you can see, APP partners are working closely with Red Hat to establish a sales, marketing, and delivery practice around Red Hat technologies, including Red Hat JBoss Middleware, Red Hat OpenShift, and Red Hat Mobile Application Platform.

In the words of John Bleuer, VP, Strategic Partners, North America: “I am thrilled that as year one of the program ends, the sophistication of our partner solutioning and delivery abilities has increased dramatically; many partners are working with us in industry and line of business (including healthcare, payments, and e-commerce); other partners are adding sophistication into the DevOps / automation practices with OpenShift, Jenkins, and Ansible, while others are honing their skills delivering app modernization and integration & BPM solutions in a cloud native environment, containerized in OpenShift. It’s an exciting time at Red Hat.”

The market is looking to digital transformation initiatives to grow and maintain competitive advantage. Challenges range from confined platforms to complex architectures, from rigid processes to lack of agility. Together with our partners, we can play a critical role in helping our customers overcome those challenges and become growing, competitive organizations.

We hope to see you at Red Hat Summit checking them out, as well as at the Red Hat Summit Ecosystem Expo!