Data and Architecture, pt. 1: Ask the Right Questions

Your business needs to make better use of its data, but what does that mean? Context matters: data governance, reporting and analytics, business intelligence. When you approach your data architecture, start by asking the right questions, the ones that solve business challenges. What data does the sales team need to increase sales by x percent? What data does the engineering team need to build and innovate products that provide a competitive advantage?

Similar questions have inspired companies to disrupt markets. Uber started by asking questions like: how can we get drivers to the locations with the most customer demand? How can we give consumers the ability to call a cab with the click of a button? Tomasz Tunguz highlights similar examples in his book Winning with Data, where he states that “the best data-driven companies operationalize data.” Operationalizing data means using the right data to rapidly change the way the company operates.

In order to use the right data, ask the right questions. Gartner says exactly this, “ask the right questions,” in its 2015 article Big Data Analytics Failures and How to Prevent Them. Put simply: what problem, what toughest business challenge, is your company trying to solve with data?

At the foundation of enterprise architecture practice, dating back to the 1970s with the Zachman Framework, the principal question related to data was “what data is needed to list the things most important to the business?” The framework focuses on the “what,” what is needed, in order to produce the right enterprise, data, and technology models for the best use of data.

In order to harness the “Power of Data,” HBR suggests that companies start with the business problem in mind and then “seek to gain insights from vast amounts of data.” By stating the specific business problem, companies can narrow the search and refine how they will find data-driven answers to their most challenging business problems.

Five features of JBoss EAP that will help get you production ready

JBoss Enterprise Application Platform 7 (EAP 7) has been out since June, and if you build and deliver using a Java EE environment and haven’t yet upgraded to EAP 7, it’s time to make the jump.

Here’s a look at what’s new in JBoss EAP 7, what has changed since JBoss EAP 6, and how to get the most out of JBoss EAP 7 as your Java EE 7 server.

Overview

JBoss EAP 7 is based on WildFly Application Server 10, which provides a complete implementation of the Java EE 7 Full and Web Profile standards. WildFly 10 does much to simplify modern application delivery based on containers and microservices.

JBoss EAP 7 features certified support for Java EE 7 and Java SE 8. Through its WildFly base it also brings experimental support for Java 9, including current development snapshots, ahead of Java 9’s expected release this fall.
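As a small taste of what Java EE 7 certification means in practice, here is a minimal sketch using the Java API for JSON Processing (JSON-P), one of the specifications introduced in Java EE 7 and provided by an EAP 7 server. The field names are invented for illustration; outside the server you would only need a JSON-P implementation on the classpath to run it.

```java
import javax.json.Json;
import javax.json.JsonObject;

public class JsonPSketch {
    public static void main(String[] args) {
        // Build a JSON document with the fluent JSON-P builder API from Java EE 7.
        JsonObject server = Json.createObjectBuilder()
                .add("product", "JBoss EAP")
                .add("version", "7.0")
                .add("javaEE", 7)
                .build();

        // Prints: {"product":"JBoss EAP","version":"7.0","javaEE":7}
        System.out.println(server);
    }
}
```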

The JBoss EAP 7 release is available for download from JBoss.org.

Continue reading “Five features of JBoss EAP that will help get you production ready”

How to get started with JBoss BPM

If you are evaluating, exploring, or just plain interested in learning more about Business Process Management (BPM), then read on; this is what you have been waiting for.
While there are quite a few resources online, they are often focused on community project code that is constantly changing, or are so disjointed that it is very difficult to find a coherent learning path.
No more.
Just a few months back, in June, the early access program for Effective Business Process Management with JBoss BPM kicked off. The book lays out a coherent learning path to get you started with BPM, with JBoss BPM Suite as the open source BPM solution of choice.
The first chapters have been put online, so you can read along as the book is written and interact with the author in the online forums.

Deal of the Day

Today only, Effective Business Process Management with JBoss BPM is half price, so head on over and grab yourself a copy using the code dotd081716au to get started with JBoss BPM Suite.

The deal goes live at midnight US ET and stays active for roughly 48 hours, longer than a single day, to account for time zone differences.

If you would like to help out with socializing this news, here is a tweet you can cut and paste into your social networks:


See more by Eric D. Schabell, contact him on Twitter for comments or visit his home site.


Intro to In-Memory Data Grids

Some of the biggest technology trends aren’t necessarily about doing something new. Things like cloud computing (as an environment) and design patterns for the Internet of Things and mobile applications (as business drivers) are building on existing conceptual foundations — virtualization, centralized databases, client-based applications. What is new is the scale of these applications and the performance expected from them.

That demand for performance and scalability has inspired an architectural approach called distributed computing. Technologies under that larger umbrella use distributed physical resources to create a shared pool for a given service.

One of those technologies, and the subject of this post, is the in-memory data grid. It takes the concept of a centralized, single database and breaks it into numerous individual nodes that work together to form a grid. Gartner defines an in-memory data grid as “a distributed, reliable, scalable and … consistent in-memory NoSQL data store[,] shareable across multiple and distributed applications.” That nails the purpose of distributed computing services: scalable, reliable, and shareable across multiple applications.
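To make the idea concrete, here is a minimal sketch of an embedded grid node using the Infinispan API (the open source project behind Red Hat’s data grid offering). The cache name, key, and value are invented for illustration; in a real grid, every JVM running this code would join the same cluster and share the entries.

```java
import org.infinispan.Cache;
import org.infinispan.configuration.cache.CacheMode;
import org.infinispan.configuration.cache.ConfigurationBuilder;
import org.infinispan.configuration.global.GlobalConfigurationBuilder;
import org.infinispan.manager.DefaultCacheManager;

public class DataGridSketch {
    public static void main(String[] args) {
        // A cluster-aware cache manager: each JVM that runs this joins the grid.
        DefaultCacheManager manager = new DefaultCacheManager(
                GlobalConfigurationBuilder.defaultClusteredBuilder().build());

        // Distribute entries across the grid, keeping two copies of each for reliability.
        manager.defineConfiguration("customers", new ConfigurationBuilder()
                .clustering().cacheMode(CacheMode.DIST_SYNC)
                .hash().numOwners(2)
                .build());

        Cache<String, String> customers = manager.getCache("customers");
        customers.put("42", "Ada Lovelace");      // stored on two nodes in the grid
        System.out.println(customers.get("42"));  // readable from any node

        manager.stop();
    }
}
```

The point of the grid is that the put and get above behave the same whether one node or fifty are running; capacity and resilience come from adding nodes, not from a bigger single database.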

Continue reading “Intro to In-Memory Data Grids”

Intro to Integration

Integration is one of those concepts that is easy to “know,” but becomes less obvious the more you try to define it. A basic, casual definition is making different things work together. The complexity, though, comes from the fact that every part of that definition has to be broken down: what are the “things,” what makes them “work together,” how are they working, and what is the goal or purpose of them working together? All of those elements can be answered differently for different organizations, or even within the same organization at different times.

An understanding of integration comes from looking at the different patterns available for integration and then defining the logic behind the integration so you can select the right patterns for your environment.

Integration Patterns

Integration itself is an architectural structure within your infrastructure, rather than an action or specific process. While getting various systems to work together has long been an IT (and organizational) responsibility, integration as a practice became more of a focus in the early 2000s. As large-scale enterprise applications emerged, there was a growing need to get those applications working together without having to redesign or redeploy the applications themselves. That push became integration.

Integration is further defined by what is being integrated; these categories are the integration patterns.

There are different types of patterns, depending on perspective. There are patterns based on what is being integrated and then there are patterns based on the topology or design of the integration. Basically, it’s the what and the how.
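As a concrete illustration of the “how,” here is a minimal sketch of one classic pattern, a content-based router, written with the Apache Camel Java DSL. Camel is just one possible implementation choice, and the file paths and order types below are invented for the example; the pattern itself (inspect each message and route it to the right destination) is independent of any particular endpoint technology.

```java
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.impl.DefaultCamelContext;

public class ContentBasedRouterSketch {
    public static void main(String[] args) throws Exception {
        DefaultCamelContext context = new DefaultCamelContext();
        context.addRoutes(new RouteBuilder() {
            @Override
            public void configure() {
                // Content-based router: inspect each incoming order file and
                // pass it to the endpoint that should handle it.
                from("file:orders/incoming")
                    .choice()
                        .when(xpath("/order/@type = 'retail'"))
                            .to("file:orders/retail")
                        .when(xpath("/order/@type = 'wholesale'"))
                            .to("file:orders/wholesale")
                        .otherwise()
                            .to("file:orders/unclassified");
            }
        });
        context.start();
        Thread.sleep(10_000); // let the route pick up a few files, then shut down
        context.stop();
    }
}
```

Swapping the file endpoints for message queues, web services, or databases changes the “what” without touching the routing logic, which is exactly the separation the patterns are meant to give you.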

Continue reading “Intro to Integration”

New styles of integration are the hallmark of Digital Transformation


Shake up your integration strategy to enable digital transformation, says VP and Gartner Fellow Massimo Pezzini. Pezzini asserts that it is not just about transforming and modernizing the infrastructure and the applications concerned. Some of the fundamental concepts of integration need to be revisited and transformed as well. Such systemic transformations punctuate the migration of legacy environments to microservices and the cloud. What may have worked in the past will no longer be applicable going forward. “Integration is dead. Long live integration,” screamed the title of one of the sessions at Red Hat Summit 2016. The session was making a point: integration, as we knew it a few years back, is dead, but integration in the digital world has a long life in the decades ahead. Join me as I walk through the new styles of integration that are the hallmark of digital transformation.

Continue reading “New styles of integration are the hallmark of Digital Transformation”

How To Import Any JBoss BRMS Example Project

This tips & tricks article comes to you after I have been asked the following question repeatedly over the last few weeks by users of the JBoss BRMS demos:

“How can I import the projects associated with the various JBoss BRMS demo projects into my own existing installation?”

What this means is that users want to get an example project into their personal installation of the product without using the project’s installation process. This is certainly possible, but not totally obvious to everyone.

Below I will walk you through how the various example projects for JBoss BRMS are set up, how the actual rules projects are loaded into JBoss BRMS when you set them up, and why. After that I will show you how to extract any of the available rules projects for import into any previously installed JBoss BRMS server.

Figure 1: In JBoss BRMS open the Administration perspective with the menu options Authoring -> Administration.

Background on how it works

The normal installation of any JBoss BRMS demo project I have provided uses a template. This template ensures that the process is always the same: download, unzip, add products, and run the installation script. After that you are done; just fire up JBoss BRMS, open the Authoring perspective, and the demo project appears in the process designer, ready for you to kick off a demo run.

These projects have a demo template that provides some consistency, and you can read about how it works in a previous article. For the initial installation run of any of these demo projects, a folder is copied from support/brms-demo-niogit to the installation at the location target/jboss-eap-{version}/bin/.niogit.

Figure 2: To import a new project, open Clone repository from the Repositories menu. This will allow you to bring in any rules project to your JBoss BRMS.

This folder contains all of the project and system Git repositories, formatted for the version of the project you have downloaded. Because this complete set of repositories is installed, when JBoss BRMS starts up for the first time it picks up exactly the state I left it in when designing the experience around this demo project.

Get your hands on a specific rules project

What I want to help you with in this article is how to extract only the rules project from one of these examples and import it into your own installation of JBoss BRMS.

Figure 3: Cloning a repository is how you import an existing project, which requires the information shown.

The following list shows the order in which to do the tasks; I will explain each one below:

  1. Download any JBoss BRMS demo project and unzip (or clone it if you like).
  2. Log in to your own JBoss BRMS and open the Administration perspective via the menu: Authoring -> Administration.
  3. Set up the new rules project you want to import: Repositories -> Clone repository -> fill in the details, including the URL of the project to import.
  4. Explore the new project in the Authoring perspective: Authoring -> Project Authoring

I am going to assume you can find a JBoss BRMS demo project of your liking from the link provided in step 1 and download or clone it to your local machine.

I will be using the JBoss BRMS Cool Store Demo as the example project you want to import into your current JBoss BRMS installation instead of leveraging the standalone demo project.

In your current installation, where you are logged in, open the Administration perspective as shown in figure 1 via the menu options Authoring -> Administration. This allows you to start importing any existing rules project. We will import the Cool Store rules project by using the clone feature for existing projects, found in the menu options Repositories -> Clone repository, as shown in figure 2.

Figure 4: Once the project has been imported (cloned), you will receive this message in a pop-up.

This will produce a pop-up that asks for some information about the project to be imported, which you can fill in as listed below and shown in figure 3:
  • Repository Name: retail
  • Organizational Unit: Demos    (select whatever org you want to use from your system)
  • Git URL:  file:///[path-to-project-you-downloaded]/brms-coolstore-demo/support/brms-demo-niogit/coolstore-demo.git

Figure 5: Explore your newly imported rules project in the Authoring perspective within your JBoss BRMS installation.

The most interesting bit here is the Git URL, which would normally be something hosted online, but the project we want to import sits locally on our filesystem, so we use a file-based URL to point to it. Click the Clone button to import the project and you should see a pop-up like figure 4 stating that you have successfully imported your project.
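If you want to sanity-check the local repository before pasting the file-based URL into the clone dialog, a small program built on the JGit library can open it and list its branches. This is purely an optional convenience, not part of the demo setup, and the path below is a placeholder for wherever you unzipped the demo.

```java
import java.io.File;
import java.util.List;

import org.eclipse.jgit.api.Git;
import org.eclipse.jgit.lib.Ref;

public class CheckDemoRepo {
    public static void main(String[] args) throws Exception {
        // Placeholder path: point this at the repository inside your unzipped demo.
        File gitDir = new File("brms-coolstore-demo/support/brms-demo-niogit/coolstore-demo.git");

        try (Git git = Git.open(gitDir)) {
            System.out.println("Valid Git repository: " + gitDir.getAbsolutePath());

            // List the branches JBoss BRMS will see when it clones this repository.
            List<Ref> branches = git.branchList().call();
            for (Ref branch : branches) {
                System.out.println("branch: " + branch.getName());
            }
        }
    }
}
```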

Now you can explore the newly imported project in the Authoring perspective and proceed as you wish, as shown in figure 5. This will work for any project I have put together for the field that is based on the standard template I use.

I hope this tips & tricks article helps you explore and enjoy the existing rules examples offered in the current collection of demo projects.


See more by Eric D. Schabell, contact him on Twitter for comments or visit his home site.

Mike Piech and Rich Sharples on Facebook LIVE from Red Hat Summit

Rich Sharples, senior director of product management, and Mike Piech, vice president of marketing, got together for a half hour at the end of the Summit day today to discuss some of the major middleware issues that have come up this week. There have been some major announcements: the new MicroProfile project, the release of Red Hat JBoss EAP 7, the growth of microservices, and the recent acquisition of 3scale and what that means for API management in Red Hat Middleware.

As a quick summary, two of the major themes underscoring a lot of the announcements around JBoss, middleware, and Java this week relate to things that are micro: microservices and MicroProfile.

Microservices has been a subtext in many of the JBoss EAP 7 sessions and in the OpenShift sessions because this containerized, immutable, consistent environment is what makes microservices possible. Containers fundamentally enable microservices. You have an underlying runtime that is commensurate with the idea of “micro.” You can scale elastically, adding instances to scale up and down. There is far less opportunity for things to change as an application travels from the desktop to the data center. These are communicating systems, and coordinating those complex webs is what container orchestration does. The application is the only thing that matters; operations is there to support the application. I hit a build button, it goes through my CI/CD system, and it’s the same configuration in every environment.

However, like any application or project architecture, it’s more than “JBoss + OpenShift = awesome microservices.” Consideration and weight have to be given to the application and the underlying technology to find a structure that fits. Microservices architecture isn’t about taking everything you’ve got and decomposing it into atomic services. It’s about having a range of sizes and services, depending on what you need. It is important to be conscious of the trade-offs that come from the increased complexity of the system. Which architecture is appropriate really depends on the organization and the technology platforms it has.

That need to understand and define the underlying framework in order to do microservices effectively is the theme of the second topic: MicroProfile. There are defined specifications for different Java platforms (Standard and Enterprise), but both assume large-scale, full server architectures. New-wave development, though, is increasingly small, with small services inside those larger complex systems. What Java EE introduced to development was consistency and dependability. As we move into a new containerized world, we must do it responsibly, preserving the consistency and stability of previous environments. The MicroProfile project was created because a number of vendors (Red Hat, IBM, Tomitribe, Payara) were on a Slack chat, discussing what they needed to do for microservices and ways they could implement it. And then there was a lightbulb moment: maybe there’s something here. This is a chance to bring the whole Java community around a new architecture, with the strengths and discipline they’ve already developed.

Watch the whole video. For MicroProfile, you can join the Google group or check out the MicroProfile site for more information and emerging discussions.

MicroProfile – Collaborating to bring Microservices to Enterprise Java

This post was originally published at Red Hat Developers.


Today at the DevNation conference in San Francisco, Red Hat’s Mark Little was joined on-stage by Alasdair Nottingham from IBM, Theresa Nguyen from Tomitribe, Mike Croft from Payara and Martijn Verburg from the London Java Community to announce a new community collaboration – MicroProfile – whose goal is to make it easier for developers to use familiar Java EE technologies and APIs for building microservice applications.

Mark talked about some of the reasons Java EE has established itself as the dominant standard for companies building business-critical, multi-tier enterprise applications, including:

  • An open standard platform that enables vendors to compete on implementation, price, or business model
  • A collaborative standard and process that is driven by many vendors and individual developers rather than a single vendor
  • A consistent and holistic vision for all architectural tiers of the application
  • A strong focus on adherence to the standard and compatibility between vendor implementations and versions of the specifications

As organizations start to think about the next generation of those business-critical applications, many of them are likely thinking about cloud-native development, Linux containers, and microservices, and how to evolve toward them using the technologies and skills they already have.

Red Hat, IBM, Payara, Tomitribe, and the London Java Community believe that Enterprise Java is a solid foundation on which to build the next generation of applications, and that the MicroProfile (which may ultimately become a submission for a standard specification) can make this easier and provide portability between vendors’ implementations. The first release of the MicroProfile is expected to be available in September, with Red Hat’s implementation to be based on WildFly Swarm.
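For a sense of what that means in code, the initial MicroProfile centers on a small set of familiar Java EE APIs (JAX-RS, CDI, and JSON-P), so the kind of service it targets can be as small as the sketch below. The class names and path are invented for illustration, and the code assumes a MicroProfile-capable runtime such as WildFly Swarm to actually serve it.

```java
import javax.ws.rs.ApplicationPath;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.Application;
import javax.ws.rs.core.MediaType;

// Activates JAX-RS and roots all resources under /api; no web.xml required.
@ApplicationPath("/api")
public class HelloApplication extends Application {
}

// GET /api/hello returns a small JSON payload.
@Path("/hello")
class HelloResource {

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public String hello() {
        return "{\"message\":\"Hello from a MicroProfile-style service\"}";
    }
}
```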

Red Hat continues to support those in the Enterprise Java community that are working hard to move Enterprise Java forward by pushing ahead on the evolution of Java EE. To emphasize this point, Red Hat has underlined its support for Java EE 8 and is committed to finishing the JSRs it leads, like CDI 2.0, and any necessary enhancements to Bean Validation while it also invests in the MicroProfile. We see synergy between the current Enterprise Java community efforts and the newly announced MicroProfile, which is born out of the same Enterprise Java community. To us it’s clear that the Enterprise Java community is forging ahead.

Red Hat understands that Enterprise Java has been successful for almost two decades thanks to the community collaboration that drove its evolution. Please join and participate in the MicroProfile effort and let’s all take the next step forward by cooperatively innovating to bring the microservice architecture to Enterprise Java.

Continuous Delivery to JBoss EAP and OpenShift with the CloudBees Jenkins Platform

If you are using JBoss Enterprise Application Platform (EAP) for Java EE development, the CloudBees Jenkins Platform provides an enterprise-class toolchain for automated CI/CD from development to production.

The CloudBees Jenkins Platform now supports integrations with both Red Hat JBoss Enterprise Application Platform (EAP) and Red Hat OpenShift across the software delivery pipeline. This enables developers to build, test, and deploy applications with Jenkins-based continuous delivery pipelines, targeting either standalone JBoss EAP 7 or JBoss EAP 7 on OpenShift.

Continue reading “Continuous Delivery to JBoss EAP and OpenShift with the CloudBees Jenkins Platform”