30 Dec 2017

Referring to ADF Faces component in EL expression

EL expressions are commonly used to specify attribute values of ADF Faces components on a page. It is interesting to know that we can use the component keyword to refer to the component instance for which the EL expression is being evaluated. It is somewhat similar to this in Java.

For example, in the following snippet the button's hint is evaluated as the button's text value, and its visible attribute is returned by a backing bean method accepting the component as a parameter:

<af:button text="#{theBean.buttonText}" id="b1"
 shortDesc="#{component.text}" visible="#{theBean.isVisible(component)}"/>

The backing bean method may look like this (the exact attribute check is illustrative):
  public boolean isVisible(UIComponent button)
  {
    //Do something with the button
    ((RichButton) button).setIcon("images/awesomeIcon.jpg");

    //Check the button's attributes, e.g. hide buttons that are disabled
    return !((RichButton) button).isDisabled();
  }

This technique can be quite useful when components are rendered inside an iterator (or a list view, a table, etc.) and we need to evaluate a component's attribute value dynamically depending on the exact component instance.
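
For instance, the visibility check could read a row-specific value attached to the stamped component with f:attribute. Below is just a sketch of such a bean method; the rowStatus attribute name and the CLOSED value are made up for illustration:

  import javax.faces.component.UIComponent;

  //Hide the component for rows flagged as closed; "rowStatus" is a
  //hypothetical attribute set on the component via f:attribute
  public boolean isVisible(UIComponent component)
  {
    Object rowStatus = component.getAttributes().get("rowStatus");
    return !"CLOSED".equals(rowStatus);
  }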

That's it!


28 Dec 2017

Building Oracle ADF applications with Docker

Recently a good friend of mine was facing a problem with building an ADF application v.12.2.1.2 against the public Oracle Maven Repository. He asked me to check if it worked for me. Well... it didn't, so there was indeed some problem with the repository. In order to keep the experiment clean and to avoid any impact on my working environment, I decided to run the test in a Docker container. And even though I could not help my friend (it simply didn't work, throwing a dependency exception), as a result of this check I got a reusable Docker image which serves as a preconfigured build machine for ADF applications (for v.12.2.1.3 the Oracle Maven Repository worked fine at that moment).

This is what I did:

1. Pull and run an Ubuntu Docker image

$: docker run -it --name adfbuilder ubuntu


2. Install Java in the adfbuilder container

# refresh the package lists first on a fresh image
apt-get update
apt-get install software-properties-common python-software-properties
add-apt-repository ppa:webupd8team/java
apt-get update
apt-get install oracle-java8-installer

3. Install Maven in the adfbuilder container

Just download the Maven binaries, unzip them into a local folder, and copy that folder into the container:

docker cp ~/Downloads/apache-maven-3.5.2 adfbuilder:/opt/apache-maven-3.5.2

Update the PATH environment variable in the container:

export PATH=$PATH:/opt/apache-maven-3.5.2/bin

Having done that, the mvn command should be available. Run it in the container and it will create a hidden .m2 folder in the user's home directory.

4. Configure Maven in the adfbuilder container to work with Oracle Maven Repository

Just put a settings.xml file in the hidden .m2 folder:

 docker cp settings.xml adfbuilder:/root/.m2/settings.xml

The settings.xml file should have the following content:
<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/SETTINGS/1.0.0                       https://maven.apache.org/xsd/settings-1.0.0.xsd">
  <servers>
    <server>
      <id>maven.oracle.com</id>
      <username>eugene.fedorenko@flexagon.com</username>
      <password><MY_PASSWORD></password>
      <configuration>
        <basicAuthScope>
          <host>ANY</host>
          <port>ANY</port>
          <realm>OAM 11g</realm>
        </basicAuthScope>
        <httpConfiguration>
          <all>
            <params>
              <property>
                <name>http.protocol.allow-circular-redirects</name>
                <value>%b,true</value>
              </property>
            </params>
          </all>
        </httpConfiguration>
      </configuration>
    </server>
  </servers>
  <profiles>
    <profile>
      <id>main</id>
      <activation>
        <activeByDefault>true</activeByDefault>
      </activation>
      <repositories>
        <repository>
          <id>maven.oracle.com</id>
          <releases>
            <enabled>true</enabled>
          </releases>
          <snapshots>
            <enabled>false</enabled>
          </snapshots>
          <url>https://maven.oracle.com</url>
          <layout>default</layout>
        </repository>
      </repositories>
      <pluginRepositories>
        <pluginRepository>
          <id>maven.oracle.com</id>
          <url>https://maven.oracle.com</url>
        </pluginRepository>
      </pluginRepositories>
    </profile>
  </profiles>
</settings>
Basically, this is enough to compile a Maven-configured ADF application in the container. We need to make sure that the source code of our application is accessible from the container. This can be done either by mapping a source folder to be visible from the container or just by copying it into the container:

docker cp /mywork/MySampleApp adfbuilder:/opt/MySampleApp

Having done that, we can run the following command to get the application compiled:

docker exec adfbuilder mvn -f /opt/MySampleApp/pom.xml compile

5. Copy JDeveloper binaries into the container
As we want to go beyond this point and be able not only to compile, but also to produce deployable artifacts (EARs, JARs, etc.), we will need to put the JDeveloper binaries into the container (basically, Maven will need ojdeploy). I have just copied the Oracle_Home folder from my Mac to the container:

docker cp /My_Oracle_Home adfbuilder:/opt/Oracle_Home

So, now I am able to build an EAR for my application in the container:

docker exec adfbuilder mvn -f /opt/MySampleApp/pom.xml package -DoracleHome=/opt/Oracle_Home

On the first run it may ask you to provide the path to your JDK:

[INFO] Type the full pathname of a JDK installation (or Ctrl-C to quit), the path will be stored in /root/.jdeveloper/12.2.1.3.0/product.conf
/usr/lib/jvm/java-8-oracle

6. Commit changes to the container
The final thing we need to do is to commit changes to the container:

docker commit adfbuilder efedorenko/adfbuilder

This will create a new Ubuntu-based image containing all the changes that we applied. We can easily run that image wherever we want across our infrastructure and use it as a build machine for ADF applications. The beauty of it is that we can run it in a cloud like Docker Cloud (backed by AWS, Microsoft Azure, Digital Ocean, etc.), Oracle Container Cloud Service, or whatever you prefer. With this approach, servers in the cloud build your application for you, which in general is quite a resource-consuming job.

That's it!


30 Nov 2017

Creating a View Object Row with ADF Bindings CreateInsert action

In this short post I am going to highlight a small pitfall related to a very common approach to creating a new record in a task flow.
Let's consider an example of a simple task flow that creates a new VO row, displays that row on a page fragment, and commits the transaction if the user clicks an "Ok" button.

The CreateInsert method call has just been dragged and dropped from the Data Controls palette. The thing is that if the user does not update any VO attributes on the view1 page fragment, the Commit method call will do nothing. The new row will not be posted to the database.
The reason for this behavior is that the ADF bindings CreateInsert action always creates an entity in the Initialized state, which is ignored by the framework while committing the transaction. Even if the entity has default values, or its create() method is overridden to set attribute values, it doesn't matter; the entity will still be in the Initialized state after the CreateInsert action. Afterwards, if any VO attributes are modified, the entity gets the New status and the framework will post the changes (perform an insert statement) while committing the transaction. This behavior is quite logical, as in most cases task flows like this create a view object row in order to get it updated by the user before submitting it to the database. However, most cases are not all cases, and if needed we can always implement a custom VO method creating and inserting a new row, and invoke it instead of the standard CreateInsert action. Like this one:

  public void addNewEmployee() {
    EmployeeViewRowImpl row = (EmployeeViewRowImpl) createRow();
    insertRow(row);
  }
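
Alternatively, if we want to keep the declarative CreateInsert action, we can explicitly promote the freshly created row from the Initialized state to New, so that the framework posts it on commit even if the user changes nothing. A sketch (the helper name is made up; it could be invoked right after the CreateInsert action):

  import oracle.jbo.Row;

  public void markRowAsNew(Row row) {
    //Promote the row from Initialized to New so that Commit will post it
    row.setNewRowState(Row.STATUS_NEW);
  }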


That's it!

16 Nov 2017

Continuous Delivery of ADF applications with WebLogic Shared Libraries

Introduction
There is a pretty popular architecture pattern where ADF applications are built on top of shared libraries. The main application is deployed as an EAR, and all subsystems are implemented within shared libraries that can be independently built and deployed to WebLogic as JARs in "hot" mode without downtime. The advantages of this approach seem obvious:
  • It decomposes the application implementing the concepts of modularization and reuse
  • CI/CD process might be much faster as only one library is going to be rebuilt/redeployed
  • There is no downtime while redeploying a shared library
It looks so cool that people choose this architecture pattern for their new projects, and they are pretty happy with the decision they made while implementing the application. They get even happier when they go live in production, as they can easily fix most bugs and implement new requirements avoiding full redeployment and without any downtime.
Definitely, before getting to production, any change (and therefore a corresponding shared library) should be deployed and tested in the earlier environments such as QA, UAT, etc.
After a while, nobody knows exactly which versions of the shared libraries are deployed in each environment. It gets tricky to support the application and implement new changes in this situation: even though a change works in one environment, there is no guarantee it is going to work in the next one, as the combination of shared libraries could be different. If it is a big application and there are many shared libraries, this can become a nightmare, and pretty often people just give up, going back to full redeployment of everything and eventually to a monolithic EAR. It's not that cool, but at least they can sleep again now.

Solution
In this post I am going to show how to put things in order and build a continuous delivery process for an ADF application built on top of shared libraries with FlexDeploy. FlexDeploy is a rapidly growing automation and DevOps solution; if you want to learn what it is all about, feel free to visit the FlexDeploy website. Here I am going to focus on how FlexDeploy helps with shared libraries by introducing the concepts of a snapshot and a pipeline.

A snapshot is a set of deployable artifacts representing the entire system. If any of the artifacts needs to be rebuilt, a new snapshot is created, containing a new version of that artifact and the previous versions of the rest of the artifacts. In our case a snapshot contains an EAR for the main ADF application and JARs for the shared libraries.

In order to create snapshots for our application, FlexDeploy should know what it is all about and what projects it consists of. There is a notion of a Release in FlexDeploy, which serves as a bucket of projects that should be built into snapshots and deployed across environments all together as a single unit.

In our example there are three projects: one for the main application and two for the departments and employees task flows, deployed as shared libraries. Each project is configured separately in FlexDeploy, and each project "knows" how its source code can be fetched and how it is built and deployed (FlexDeploy uses workflows for building and deploying, but that's another big story which is way beyond this post).

Having all that defined, whenever a developer pushes a code change to any of the projects included in the release, FlexDeploy builds a new snapshot. It rebuilds only those projects that have changed (producing EARs and JARs); the rest of the artifacts are included in the new snapshot as-is.

OK, now that we can build snapshots, let's deploy them across environments. The release definition refers to a pipeline.

A pipeline is an approach that guarantees deployment of the entire snapshot across environments in a strict, predefined order. It means that a given snapshot (in other words, a given combination of EAR/JAR versions) can be deployed only in this order: Dev -> QA -> Prod (if the pipeline is defined that way). It just can't get to Prod if it has not been successful in Dev and QA. A pipeline consists of stages referring to environments; each stage consists of gates (approvals, test results, etc., meaning that a snapshot should pass all the gates before being processed in this environment) and steps (deploy, run automated tests, notify, manual steps, ...).

So, basically, the deployment is just a pipeline step within a pipeline stage (environment). This step is smart enough to redeploy only those artifacts that have changed (unless the step is configured to perform a "force" deploy). FlexDeploy tracks which artifact versions have been deployed in every environment.

As a conclusion, I would say that when using FlexDeploy as a DevOps solution for ADF applications with shared libraries, we gain all the benefits of this architecture pattern on one hand and, on the other hand, we keep things in order, knowing exactly what combination has been deployed across environments, what has been tested and is ready to go live, and what has failed.

That's it!


31 Oct 2017

Implementing Dynamic Dialog Handler with Functional programming

In my previous post I mentioned a common use case where we need to programmatically check if the current transaction is dirty and notify the user about that before doing something, like "You have unsaved changes that will be lost, do you want to continue?".
Suppose that we need to notify the user about a dirty transaction in many places across the application: when navigating from one view to another, when clicking a Search button, when invoking a business service method, etc. In every single scenario we need to do something different after the user confirms that they want to proceed. It means that our dialog listener should somehow know what it was all about and what to do next.

The solution could be to add a custom attribute to the af:dialog component pointing to a function which is going to be invoked when the user clicks "Yes" in the dialog:

<af:popup id="pDirtyTransaction" contentDelivery="lazyUncached">
  <af:dialog title="Warning" type="yesNo" closeIconVisible="false"
             id="dDirtyTransaction"
    dialogListener="#{theBean.dirtyTransactionDialogListener}">
     <af:outputText value="You have unsaved changes, do you want to continue?"
                    id="ot1"/>

     <f:attribute name="dialogHandler" value=""/>                   

  </af:dialog>
</af:popup>


In that case the dialog listener may look like this:

public void dirtyTransactionDialogListener(DialogEvent dialogEvent) {       
  Map attrs = dialogEvent.getComponent().getAttributes();
  Consumer<Boolean> dialogHandler = (Consumer) attrs.get("dialogHandler");
  if (dialogHandler != null) {
      dialogHandler.accept(dialogEvent.getOutcome() == DialogEvent.Outcome.yes);
      attrs.put("dialogHandler",null);
  }                   
}

We expect here that the dialogHandler attribute points to an object implementing the Consumer functional interface.

There is a method in our utils class that shows the popup with the dialog:

public static void showDirtyTransactionPopup(Consumer dialogHandler) {
  if (dialogHandler != null) {
      JSFUtil.findComponent("dDirtyTransaction").getAttributes().
              put("dialogHandler",dialogHandler);
  }

  RichPopup popup =
      (RichPopup) JSFUtil.findComponent("pDirtyTransaction");
  popup.show(new RichPopup.PopupHints());
}


Let's use this approach in a simple scenario. There are two view activities in our task flow: View1 and View2. The user clicks a button to navigate from one view to another. While navigating, we need to check if the current transaction is dirty and, if it is, ask the user whether they want to proceed. We can leverage the power of Java 8 lambda expressions and implement the button action listener like this:

public void buttonActionListener(ActionEvent actionEvent) {

  if (Utils.isTransactionDirty()) {       

       Utils.showDirtyTransactionPopup((yesOutcome) -> {          

           //the code below will be invoked by the dialog listener
           //when the user clicks a button on the dialog                                                                     
           if ((Boolean) yesOutcome) {
               //the user has agreed to proceed,
               //so let's rollback the current transaction
               Utils.getCurrentRootDataControl().rollbackTransaction();            

               //and queue an action event for this button again
               new ActionEvent(actionEvent.getComponent()).queue();
           } });

   } else
       //just navigate to View2
       Utils.handleNavigation("goView2");
}

Based on this technique, we could implement a declarative component serving as a dialog with dynamic content and a dynamic handler.

That's it!






29 Oct 2017

Checking ADF BC Transaction Status

There is a very common use case where we need to programmatically check if the current transaction is dirty and notify the user about it. The most common approach is to get an instance of the current data control frame or data control and check its isTransactionDirty() and isTransactionModified() methods.

For example, like this:

    private boolean isTransactionDirty() {
        BindingContext context = BindingContext.getCurrent();
        DataControlFrame dcFrame = context.dataControlFrame();               
        return dcFrame.isTransactionDirty();
    }

Or like this:

    private boolean isTransactionDirty() {
        BindingContext context = BindingContext.getCurrent();
        DCBindingContainer binding = (DCBindingContainer) context.getCurrentBindingsEntry();
        DCDataControl dc = binding.getDataControl();      

        return dc.isTransactionDirty();       

        //or       

        return dc.isTransactionModified();       

        //or       

        return dc.isTransactionDirty() || dc.isTransactionModified();       
    }

Note that in the second example both the isTransactionDirty() and isTransactionModified() methods are used. In the good old days, when people worked with 11g, the isTransactionDirty() method checked whether the underlying model was dirty (basically, whether the ADF BC transaction was dirty). The isTransactionModified() method has never done that; it has always checked its internal flag only, which is useful when it comes to a non-transactional data control (e.g. a bean data control). Having both of those methods was nice, as it gave some flexibility: you could use either of them (or both) depending on what you were actually checking.

Nowadays (in 12c) isTransactionDirty() is combined with isTransactionModified(): it checks both the internal flag (which is set whenever any data-bound value is changed) and the underlying model transaction, and it returns true if either of them is true. Having said that, you can no longer use isTransactionDirty() to check whether the ADF BC transaction is dirty.

Let's say there is a transient view object and you use it on your screen for some temporary values (e.g. implementing custom filtering or a form with input values for a business service method). Since those values are data bound (even though they have nothing to do with the ADF BC entity cache), the framework will mark the internal data control flag as dirty whenever the values are changed. So, the isTransactionDirty() method in 12c is going to return true. The user hasn't done anything bad yet, and we are scaring them with a notification about a dirty transaction.

The solution is to override the method in the data control. You can see how to tell the framework to use a custom data control here. So, in our custom data control we are going to override the isTransactionDirty() method:

    //We consider transaction as dirty only if BC transaction is dirty
    //all manipulations with transient VOs/attributes should not matter
    @Override
    public boolean isTransactionDirty()  {
       ApplicationModule am = getApplicationModule();
       return (am != null
                  && am.getTransaction() != null
                  && am.getTransaction().isDirty());
    }
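
For context, this override would live in a custom data control class extending the standard ADF BC data control. A minimal sketch (the class name is made up, and the constructor follows the usual custom data control examples, so treat the exact signature as an assumption):

    import java.util.Hashtable;
    import oracle.adf.model.bc4j.DCJboDataControl;
    import oracle.jbo.ApplicationModule;

    public class CustomDCJboDataControl extends DCJboDataControl
    {
      public CustomDCJboDataControl(ApplicationModule am, Hashtable env, String id)
      {
        super(am, env, id);
      }

      //... the isTransactionDirty() override shown above goes here
    }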


That's it!




30 Sept 2017

Right getters for "Boolean" and "boolean" values

In this post I would like to highlight a small pitfall that sometimes spoils ADF/JSF developers' lives. It happens when an EL expression on a page refers to a boolean property of a managed bean or an object, and the getter for this property is declared in the wrong way. This leads to either PropertyNotFound or PropertyNotReadable exceptions.

Let's consider a simple example. There is a managed bean with the following methods:

  public boolean isPrimitiveValue()
  {
    return true;
  }


  public Boolean isObjectValue()
  {
    return Boolean.TRUE;
  }


There is a page with a couple of buttons referring to the managed bean:

 <af:button text="button 1" id="b1" rendered="#{theBean.primitiveValue}"/>
 <af:button text="button 2" id="b2" rendered="#{theBean.objectValue}"/>

It works well for the first button, but it doesn't for the second one, raising a PropertyNotFound exception like "...The class 'com.cs.adfpractice.view.TheBean' does not have the property 'objectValue'...".

The thing is that the EL engine can't resolve a getter starting with "is" for the Boolean object type; by the JavaBeans conventions, the "is" form of a getter works only for the primitive boolean type.

If we change the second getter like this:

  public Boolean getObjectValue()
  {
    return Boolean.TRUE;
  }

It will work perfectly.

That's it!


10 Sept 2017

Using Custom View Objects with Entity Associations

While implementing business logic at the ADF BC layer, we leverage entity associations in our code as a convenient way to manipulate entities that are related to each other by design. For example, there is a DepartmentEmployeesEL association linking the Department and Employee entities. In the Department entity implementation class there is an association accessor returning a row iterator with the employees working for this department:
  public RowIterator getEmployees()
  {
    return (RowIterator) getAttributeInternal(EMPLOYEES);
  }

This accessor can be used in a method iterating over all associated employees and performing some action on each of them:
  private void processEmployees()
  {
    RowIterator employees = getEmployees();
    while (employees.hasNext())
    {
      EmployeeImpl employee = (EmployeeImpl) employees.next();
      // do something with employee
    }
  }
In order to retrieve the list of employees, the framework builds an internal view object according to the association definition. However, there is a way to control how this view object is created; for example, we may want to specify a desired sorting order or add an extra where clause to the VO query. We can tell the framework to use our custom view object definition instead of the default one.
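
As an illustration of what such a custom view object definition might do, here is a sketch of a VO implementation class that imposes a default sorting order on the rows returned by the accessor (the class name and the order-by clause are made up):

  import oracle.jbo.server.ViewObjectImpl;

  public class EmployeesAccessorVOImpl extends ViewObjectImpl
  {
    @Override
    protected void create()
    {
      super.create();
      //Hypothetical tweak: return associated employees sorted by hire date
      setOrderByClause("HIRE_DATE DESC");
    }
  }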

Having done that, we have full control over how the rows returned by the association accessor are retrieved from the database.

That's it!

20 Aug 2017

An easy way to clean ADF BC entity cache

It's time to get back to posting. In this short post, the first one after a long break, I am going to show an easy way to clean up the entity cache of a specific entity definition and to force dependent view objects to get re-executed.

Let's say there is a dashboard page containing lots of various charts, diagrams, lists and tables. All that fancy stuff is based on a number of view objects. Let's assume these view objects represent, in different ways, data from a database table containing some billing information, BillingLog. So, there is an entity BillingLogEO, and all dashboard VOs are based on this entity. When it comes to dashboards, a common use case is to get the dashboard refreshed, either manually or automatically. We could implement that by iterating over all dashboard view objects and re-executing them one by one explicitly, or we can just tell the framework to clean up the entity cache of BillingLogEO in our application module implementation class:

  getDBTransaction().clearEntityCache("model.entities.BillingLogEO");

This will clean the specified entity cache in the current transaction and clear the caches of all dependent view objects, which will force them to get re-executed.
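
To make this available to the UI, the cleanup can be exposed as a client method on the application module and invoked from a "Refresh" action via a method binding. A minimal sketch (the method name refreshDashboard is made up for illustration; it would also have to be added to the AM's client interface):

  //In the application module implementation class
  public void refreshDashboard()
  {
    //Clears the BillingLogEO entity cache; dependent VOs will re-execute
    getDBTransaction().clearEntityCache("model.entities.BillingLogEO");
  }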

That's it!



30 Apr 2017

Shared View Object with Range Paging access mode

Recently I came across an interesting issue related to shared view object instances with the Range Paging access mode. So, there is a VO instance defined in a shared application module with the pagination feature (Range Paging access mode) switched on.
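
The access mode is configured declaratively, but just to make the setup concrete, the programmatic equivalent for an ordinary VO instance would look roughly like this (a sketch; vo is assumed to be obtained from the application module):

  import oracle.jbo.RowSet;
  import oracle.jbo.ViewObject;

  void enableRangePaging(ViewObject vo)
  {
    vo.setAccessMode(RowSet.RANGE_PAGING);
    vo.setRangeSize(25); //rows fetched per page; illustrative value
  }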

This shared VO instance is used in a view accessor backing an LOV.

Well... It doesn't work. The LOV's search dialog always shows only the first ~50 records. It is impossible to scroll down and see the rest of the records.

I am not sure if it is documented anywhere, but it makes sense. A shared view object is supposed to share its row sets among many clients. Obviously, with the Range Paging access mode this concept just can't work, as each client could have its own active range.

Actually, switching on the Range Paging access mode for shared VOs is not a common use case. Usually range paging is used for VOs returning a great number of records, which is not typical for shared VOs providing common reference data. But, anyway, just be aware of this behavior.

That's it!


25 Mar 2017

Passing Values to JavaScript from Managed Bean

In this simple post I am going to consider a common use case where we need to invoke a JS function from a managed bean method, and this function consumes some value provided by the managed bean. Let's have a look at the options we have to pass this value from a Java bean to a JS function.

The easiest and the most obvious option is to pass the value as a parameter of the JS function:

JavaScript function:
        function alertParamValue(paramValue)
        {
          alert(paramValue);
        }

Managed bean method:

  import javax.faces.context.FacesContext;
  import javax.faces.event.ActionEvent;
  import oracle.adf.view.rich.render.ExtendedRenderKitService;
  import org.apache.myfaces.trinidad.util.Service;

  private void renderScript(String script)  {
    FacesContext fctx = FacesContext.getCurrentInstance();
    ExtendedRenderKitService erks =
        Service.getRenderKitService(fctx, ExtendedRenderKitService.class);
    erks.addScript(fctx, script);
  }

  public void paramButtonListener(ActionEvent actionEvent) {
    StringBuilder script = new StringBuilder();
    script.append("alertParamValue('came from managed bean');");
    renderScript(script.toString());   
  }

However, sometimes passing a parameter to a JS function is not the best option: the function's implementation may be complicated, and it would require much effort to pass the parameter's value to the exact place in the code where it is used. In this case the JS function may refer to a "helper" JS function returning the parameter's value, and this helper function is rendered dynamically in a managed bean:

JavaScript function:
        function alertFunctionValue()
        {
          alert(renderedFunction());
        }

Managed bean method:
public void functionButtonListener(ActionEvent actionEvent) {
 StringBuilder script = new StringBuilder();    
 script.append("function renderedFunction() {return 'came from managed bean'}");
 script.append("alertFunctionValue();");
 renderScript(script.toString());
}

Another solution for this case could be implemented by means of JavaScript variables:

JavaScript function:
        var varValue;

        function alertVarValue()
        {
          alert(varValue);
        }

Managed bean method:
  public void varButtonListener(ActionEvent actionEvent)  {
    StringBuilder script = new StringBuilder();
    script.append("varValue = 'came from managed bean';");
    script.append("alertVarValue();");
    renderScript(script.toString());
  }

The sample application for this post is available here. It requires JDeveloper 12.1.3.

That's it!