The concept of agile integration may, depending on whom you ask, appear to be a contradiction in terms. Integration used to be associated with “slow,” “monolithic,” “only to be touched by the expert team,” and so on. Big, complex legacy enterprise service buses connected to your applications were the technology of choice at a time when agility was not a requirement, when the cloud was barely an idea, and when containers were associated with maritime shipping rather than with application packaging and delivery.
Can the principles of agile development be combined with those of modern integration? Our response is yes, and we call it agile integration. Let us show you what it is, why it is important, and what we at Red Hat are doing about it.
Software development methodologies have evolved rapidly in the last few years to incorporate innovative concepts that result in faster development cycles, the agility to react to change, and immediate business value. Development now takes place in small teams, changes can be approved and incorporated quickly to keep pace with the changing demands of the business, and each iteration of the code results in a working product. No more long development cycles and never-ending approvals for changes. Just as importantly, business and technical users join forces and collaborate to optimize the end result.
In addition, modern integration requires agility, cloud-readiness, and support for modern integration approaches. In contrast with legacy, monolithic ESBs, modern integration is lightweight, pattern-based, scalable, and able to manage complex, distributed environments. It has to be cloud-ready and support modern architectures and deployment models like containers. It also has to provide integration services with new, popular technologies, like API management, which is becoming the preferred way to integrate applications and is at the core of microservices architectures. And it must support innovative, fast-evolving use cases such as the Internet of Things (IoT).
Continue reading “Meet application integration in the times of hybrid cloud”
EclipseCon France is taking place this week in Toulouse, France (June 13-14, 2018) and it’s offering a great lineup of top-notch sessions on nine different tracks, from IoT to cloud and modeling technologies. This year, there is even a dedicated track for “Microservices, MicroProfile, EE4J and Jakarta EE,” which is covering topics such as Istio, 12-factor apps, geoscience, machine learning, noSQL database integration, cloud-native application development, security, resilience, scalability, and the latest statuses of the Jakarta EE and MicroProfile open source specification projects. Under this track, we are hosting two sessions:
But we are also delivering other interesting sessions under the “Reactive Programming” track:
Under the “IoT” track:
Under the “Eclipse IDE and RCP in Practice” track:
And, under the “Cloud & DevOps” and “Other Cool Stuff” tracks:
For those of you who will be at the conference, we invite you to attend the sessions above and to stop by the Red Hat booth to learn how Red Hat can help your organization solve your IT challenges (and get your swag too!). And for those of you who would like to learn more about Red Hat offerings in relation to the topics above, please visit the following links:
Red Hat OpenShift Application Runtimes is a collection of cloud-native application runtimes that are optimized to run on OpenShift, including Eclipse Vert.x, Node.js, Spring Boot, and WildFly Swarm. In addition, OpenShift Application Runtimes includes the Launch Service, which helps developers get up and running quickly in the cloud through a number of ready-to-run examples — or missions — that streamline developer productivity.
New Cache Booster with JBoss Data Grid integration
In our latest continuous delivery release, we have added a new cache mission that demonstrates how to use a cache to improve the response time of applications. This mission shows you how to:
- Deploy a cache to OpenShift.
- Use a cache within an application.
The common use case for this booster is to cache service result sets, both to decrease the latency associated with data access and to reduce the workload on the backend service. Another very common use case is to reduce the volume of messages sent across a distributed system.
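The read-through pattern behind this booster can be sketched without the actual JBoss Data Grid API. The following minimal Java cache is purely illustrative (the class and method names are our own, not from any Red Hat library): on a miss it fetches from the backing service and stores the result, and on a hit it skips the backend entirely, which is where the latency and workload savings come from.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

// Minimal read-through cache sketch (illustrative only, not JBoss Data Grid).
class ResultCache<K, V> {
    private final Map<K, V> store = new ConcurrentHashMap<>();
    private final Function<K, V> backend; // the (slow) backing service call
    private int backendCalls = 0;         // exposed only to illustrate the savings

    ResultCache(Function<K, V> backend) {
        this.backend = backend;
    }

    // On a cache miss, call the backend and remember the result;
    // on a hit, return the stored value without touching the backend.
    V get(K key) {
        return store.computeIfAbsent(key, k -> {
            backendCalls++;
            return backend.apply(k);
        });
    }

    int backendCalls() {
        return backendCalls;
    }
}
```

Repeated lookups for the same key then reach the backend only once, which is exactly the effect the booster demonstrates against a real remote service.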
Continue reading “Red Hat OpenShift Application Runtimes: Delivering new productivity, performance, and stronger standards support with its latest sprint release”
Red Hat’s offering in cloud-native application development has just taken another step forward with the announcement of support for Node.js. Conor O’Neill from our partner nearForm shares his thoughts on the role that Node.js and Red Hat OpenShift Application Runtimes (RHOAR) will play in Red Hat’s market leadership in cloud-native application development, modernization, and migration.
Read more here: Red Hat makes Node.js a first-class citizen on OpenShift with RHOAR, by Conor O’Neill, nearForm
Luis I. Cortes. Senior Manager, Middleware Partner Strategy – @licortes_redhat
In today’s digital world, software strategy is central to business strategy. To stay competitive, organizations need customized software applications to meet their unique needs — from customer engagements to new product and services development. Drawn-out development projects are no longer acceptable, given business demands. Therefore, the need to speed up application development, testing, delivery, and deployment is no longer optional but a must-have competency.
At the same time that developers are confronting this challenge to deliver solutions more quickly, they are also facing the most diverse technology ecosystem in the history of computing. To address this challenge, development teams must modernize architecture, infrastructure, and processes to deliver higher-quality applications with greater agility.
Cloud native development is an approach to building and running applications that fully exploits the advantages of the cloud computing model. It is multidimensional, involving architecture, infrastructure, and processes based upon four key tenets:
- Services-based architecture: microservices or any modular, loosely coupled model that enables independent scalability, easier maintenance, and polyglot language runtimes.
- Containers (such as Docker images): the deployment unit and self-contained execution environment, providing consistency and portability across cloud infrastructures.
- DevOps automation: implementing the processes, practices, and instrumentation that automate application development, testing, and deployment.
- API-based design: the only communication allowed is via service interface calls over the network. No direct linking, no direct reads of another team’s data store, no shared memory. This enforces an outside-in perspective.
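The API-based design tenet can be sketched with a hypothetical pair of services (all names below are illustrative, not from any Red Hat API). The order service depends only on the inventory team's published contract; it has no handle on inventory's database, so the inventory team can change its storage freely:

```java
// API-based design sketch: the inventory team publishes a contract; consumers
// depend only on that contract, never on inventory's internal data store.
interface InventoryApi {
    // In practice this would be a network call (REST, gRPC, messaging).
    int stockLevel(String sku);
}

class OrderService {
    private final InventoryApi inventory; // an API client, not a DB connection

    OrderService(InventoryApi inventory) {
        this.inventory = inventory;
    }

    // All knowledge of inventory flows through the published interface.
    boolean canFulfill(String sku, int quantity) {
        return inventory.stockLevel(sku) >= quantity;
    }
}
```

Because `OrderService` compiles against the interface alone, swapping the inventory implementation (or its whole technology stack) never ripples into the order team's code.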
Continue reading “Cloud Native Application Development – Adopt or Fail”
Sometimes we would like to change the behavior of an application fast. I mean, really fast.
Traditional development cycles for enterprise applications take weeks, if not months, for a new version to be ready in production. Even in the world of DevOps, containers, and microservices, where we can spin up new versions of an app in days or even hours, we need to go through development cycles that are too far removed from the business users.
Welcome to the world of business rules and decision services, along with low code development.
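To make the idea concrete, here is a minimal, hypothetical sketch (not the Red Hat Decision Manager API) of a decision service in which the rules are data rather than compiled application logic. Because the rules are registered as data, a business user's rule change does not require a full development cycle:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.IntPredicate;

// Tiny decision-service sketch: rules live as data, so behavior can change
// without rewriting and redeploying application code.
class DiscountDecision {
    private static class Rule {
        final IntPredicate condition;
        final int discountPercent;
        Rule(IntPredicate condition, int discountPercent) {
            this.condition = condition;
            this.discountPercent = discountPercent;
        }
    }

    private final List<Rule> rules = new ArrayList<>();

    // In a real decision service, rules would come from a repository or an
    // authoring UI; registering them here keeps the sketch self-contained.
    void addRule(IntPredicate condition, int discountPercent) {
        rules.add(new Rule(condition, discountPercent));
    }

    // First matching rule wins; no match means no discount.
    int discountFor(int orderTotal) {
        for (Rule r : rules) {
            if (r.condition.test(orderTotal)) return r.discountPercent;
        }
        return 0;
    }
}
```

A real rules engine adds authoring tools, versioning, and conflict resolution on top of this idea, but the separation of decision logic from application code is the core of it.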
Continue reading “A DevOps approach to decision management”
As we described in an earlier blog, microservices are mini-applications, each devoted to a single, specific function. They are discrete (independent of other services in the architecture), polyglot with a common messaging or API interface, and they have well-defined parameters.
As application development and IT operations teams have started streamlining and speeding up their processes with methodologies like Agile and DevOps, they have increasingly begun treating IT applications as microservices. This breaks up potential bottlenecks, reduces dependencies on services used by other teams, and can help make IT infrastructure less rigid and more distributed.
One area where we are seeing this looser, more distributed approach to service development is with business rules.
Business rules and processes in a traditional structure tend to be centralized, with the complete set of functionality defined for all workflows. The problem with centralization is that, because there is a single collection of business rules, any change to one set of rules can affect many other sets, even those for different business functions.
Micro-rules essentially treat each functional set of rules as its own service — well-defined, highly focused, and independent of other rules.
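A minimal sketch of this idea, with illustrative names rather than any actual Red Hat Decision Manager API: each functional rule set sits behind its own small, well-defined interface, so a change to the shipping rules cannot ripple into the pricing rules.

```java
import java.util.HashMap;
import java.util.Map;

// Micro-rules sketch: each functional rule set is its own small, independent
// service behind a well-defined interface, registered under its business
// function rather than merged into one central rule base.
class MicroRules {
    // One rule service evaluates one kind of fact (here, an order amount)
    // and returns a decision for its own business function only.
    interface RuleService {
        String evaluate(double amount);
    }

    private final Map<String, RuleService> services = new HashMap<>();

    void register(String function, RuleService service) {
        services.put(function, service);
    }

    String decide(String function, double amount) {
        return services.get(function).evaluate(amount);
    }
}
```

Replacing the service registered under "shipping" leaves every other rule set untouched, which is the independence the centralized model lacks.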
Figure – Function rule sets as micro-rules
Continue reading ““Micro-rules,” event-driven apps, and Red Hat Decision Manager”
The latest edition of the white paper titled “The Business Value of JBoss Enterprise Application Platform,” which summarizes the benefits and value that Red Hat customers are seeing by moving to JBoss EAP, has been released.
As the paper states, “IDC interviewed organizations that are using JBoss EAP to develop and run various business applications. These study participants explained that they not only have significantly reduced platform costs with JBoss EAP but also are supporting important organizational IT initiatives such as containerization, microservices, and hybrid cloud use.” The interviewed participants varied in size from medium to large organizations and belonged to a set of diverse vertical industries.
Some of the results from this study are:
- 481% 3-year ROI
- 8-month payback period
- $50K USD average annual benefits per 100 users
- 43% more new applications released per year
- 21% faster time to deliver new applications
- 38% more new features released per year
- 74% fewer productive hours lost to unplanned downtime per year
Continue reading “The Business Value of JBoss Enterprise Application Platform – latest white paper by IDC”
The rise of microservices and containerized environments comes with its own set of demands and challenges for developers, who are being asked to quickly and reliably bring new features to market and adhere to strict best practices.
Thomas Johnston from our partner Shadow-Soft recognizes these pain points and outlines three benefits that RHOAR offers to speed up microservices development.
Read more here: Microservices slowing you down? Streamline the orchestration process with Red Hat OpenShift Application Runtimes (RHOAR)
There is a myth that Java EE containers aren’t fast and agile enough for building modern applications. Although this may be true for some app server vendors, it’s definitely not the case for Red Hat JBoss Enterprise Application Platform (JBoss EAP). JBoss EAP is a modern application platform with a modular structure that enables services only when required, improving startup speed.
With this in mind, we decided to run a comparison between JBoss EAP and other technologies that are touted as the best for cloud-native applications. The results did not surprise us:
Note: The performance tests above were produced without any performance optimization, and if you run the tests yourself, you might get different results depending on your hardware and memory. The conclusion from the above results is that JBoss EAP is not slower and does not use more memory than the other runtimes.
When comparing a JBoss EAP instance running a Java EE Web Profile app, a JBoss EAP instance running a Spring application, Tomcat, and Spring Boot, you can see that in our tests, JBoss EAP running the Java EE Web Profile app was faster, used less memory, and had the highest throughput under load. You can find the entire test suite and source code at the following location:
Continue reading “Red Hat JBoss EAP – a platform for current and future workloads”