
SAML, REST, smart phones and you


(or Smart devices, not so smart protocols)

I’ve been working on and off with a customer on a project that involves all sorts of cool buzzwords – iPhone/Android/Blackberry Apps as clients, using REST to invoke Web Services, authenticating via SAML. While I can’t go into the details or reveal too much about the project there is one line of discussion that is really interesting.

First the background:

A thick client, running on a smartphone, will do some sort of handshake with one web server to authenticate the user.
Once the user is authenticated that server will issue the user a SAML Assertion.
The client will then use the SAML assertion to authenticate to a different server and will send REST-style requests to invoke services on that server.

So something like this:

[Diagram: smartphone app authenticates to the AuthN server, receives a SAML assertion, and presents it with REST requests to the separate REST server]

This raises the question: why not just use a conventional web SSO product like Oracle Access Manager? An excellent question, and the short answer is that the Authentication Server and the REST Server are run by two different companies and don't share any infrastructure (a more common situation than you might think).

SAML actually solves a whole bunch of painful problems in this architecture – the AuthN server can sign the SAML Assertion to prove its validity and protect it from alteration and encrypt it to prevent the user from even seeing its contents. SAML also allows the AuthN server to send additional information about the user (i.e. attributes) in an extensible way – and additional attributes can be added later without needing to change any infrastructure, communication protocols or even the client.

Did you fill up your Buzzword bingo card yet?

So moving on to the problems…

Transmitting the SAML Assertion
If we were using SOAP to go from the device to the server, WS-Security would have solved all of our problems. That standard spells out exactly how to attach a SAML assertion to a SOAP message so that both the client and server can understand it. Unfortunately almost all "smart" devices lack a full SOAP stack. Further complicating matters is the fact that REST is an (air quote) "standard" intended to be a very lightweight way to send requests to a server and get back a structured response. Because it's intended to be so much simpler than SOAP, there are very few (read: no) standards around things like authentication, encryption, signing or any of the other things that make SOAP a bit complicated at times.

All of which is just a long way to say that you’re basically on your own figuring out how to use SAML with REST.

There are a few obvious options of how to use SAML with REST.

  1. Send the SAML assertion to the server and swap it for a cookie. Your deployment then becomes nothing more than a standard web SSO situation and your application doesn’t need to worry about the SAML bits.
  2. Send the SAML assertion in every request as part of the POST data. This places the responsibility for parsing the SAML assertion into your application logic or something that can see and handle the HTTP POST data stream.
  3. Send the SAML assertion in every request as an HTTP header. This is a slight variant of #2 that is more similar to SOAP’s WS-Security model where the authentication information is separated from the actual input/output parameters of the call.

I like option 1 because it pushes handling the SAML assertion out of scope, or in other words into an S.E.P. (Somebody Else's Problem). On the other hand, having a thick client interacting and cooperating with a web SSO solution introduces a whole raft of other issues, including properly handling things like idle and session timeouts, dealing with redirects, and a long list of others. Web SSO products were designed to interact with web browsers, and their 'on the wire' behavior can be difficult to understand from an HTTP client's perspective. I'm not convinced that this is the best solution to our problem, so on to options 2 and 3.

Options 2 and 3 are nearly identical – differing only in where the assertion goes in the HTTP request. That subtle difference is actually kind of a big deal, and after thinking about it for a while I vastly prefer option 3 to option 2. Besides the logical separation of authentication and inputs, I have a few other reasons, such as the fact that POST data can generally only be consumed once, which is really important for my next trick.

You probably know about Servlet Filters, and if you've been reading this blog for a while you probably know about WebLogic's security framework (often called the SSPI framework). What you may not know is how to put them together into a Servlet Authentication Filter. Basically you write a Servlet Filter that takes responsibility for getting the authentication token and then asks WebLogic to call the authentication provider for you. If everything works out, WebLogic goes ahead and establishes a session for you. Then, when your actual service wants to know who is invoking it, it can ask by calling weblogic.security.Security.getCurrentSubject().
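
To make that concrete, here is a minimal sketch of such a filter. The header name (X-SAML-Assertion) is an assumption, and the call into WebLogic is shown as a placeholder helper – the exact identity assertion API and token type name depend on how your SAML identity asserter is configured, so treat this as an outline rather than a drop-in implementation.

import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class SamlHeaderAuthFilter implements Filter {

    // Hypothetical header name; use whatever your client and server agree on.
    private static final String SAML_HEADER = "X-SAML-Assertion";

    public void init(FilterConfig config) {
    }

    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        HttpServletResponse response = (HttpServletResponse) res;

        // If WebLogic already has an authenticated subject for this caller, just continue.
        if (request.getUserPrincipal() == null) {
            String assertion = request.getHeader(SAML_HEADER);
            if (assertion == null) {
                response.sendError(HttpServletResponse.SC_UNAUTHORIZED, "Missing SAML assertion");
                return;
            }
            // Hand the token to WebLogic so the configured SAML identity asserter can
            // validate it and establish the session. Placeholder: wire this up to the
            // WebLogic identity assertion APIs described in the SSPI documentation.
            if (!assertIdentityWithWebLogic(assertion, request)) {
                response.sendError(HttpServletResponse.SC_UNAUTHORIZED, "Invalid SAML assertion");
                return;
            }
        }
        chain.doFilter(req, res);
    }

    public void destroy() {
    }

    // Hypothetical helper; the real call goes through WebLogic's security framework.
    private boolean assertIdentityWithWebLogic(String assertion, HttpServletRequest request) {
        throw new UnsupportedOperationException("delegate to the WebLogic SAML identity asserter here");
    }
}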

No fuss, no muss. And most importantly you don't have to commingle service logic with any code to deal with SAML, encryption keys, XML parsing or anything else unrelated to doing your actual work!

Session Management
One of the concerns with sending the SAML assertion along with every request is the performance impact of the XML and crypto operations. If you are invoking a simple service (hello world for example) the overhead of all of the SAML seems like it might be awfully expensive. If you had to pay that price with every request the overhead would quickly eat up your CPU cycles grinding even a reasonably fast machine to a halt under load.

Thankfully WebLogic’s designers thought about this very problem.

The first and most obvious solution is to act just a little bit more like a browser. When you authenticate to WebLogic it automatically creates a session for you and sends a cookie (usually named JSESSIONID) back to your browser. If you include that cookie with subsequent requests there's no need to authenticate again. So if you smarten up the client so that it handles cookies gracefully, you'll avoid WebLogic having to re-parse and validate your SAML assertion. In fact, if I'm reading the relevant iPhone SDK docs (just to cite one example) correctly, I think the iPhone SDK handles cookies properly for you automatically by default! Android includes Apache HttpClient, which makes cookie handling almost trivial. And as for Blackberry, well, it's J2ME, which means you'll have to do cookie parsing by hand; which, while unfortunate, isn't the end of the world.

As long as you do the right thing with the cookies coming from WebLogic your session will be fine. If something happens to your session (e.g. the server gets rebooted, you get shunted off to another server that doesn’t know about your session, your session times out, etc) the auth filter will automatically reestablish a session as long as your SAML assertion is still OK.
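
Here is a minimal sketch of what "doing the right thing" looks like on the client side, using only the standard java.net classes: install a cookie manager once so JSESSIONID is replayed automatically, and send the assertion on every request. The URL and the X-SAML-Assertion header name are placeholders for whatever your deployment actually uses.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.CookieHandler;
import java.net.CookieManager;
import java.net.HttpURLConnection;
import java.net.URL;

public class RestClient {
    public static void main(String[] args) throws Exception {
        // Install a cookie manager once; the JSESSIONID cookie issued by WebLogic
        // is then stored and replayed automatically on subsequent requests.
        CookieHandler.setDefault(new CookieManager());

        String samlAssertion = args[0]; // encoded assertion obtained from the AuthN server

        // Placeholder URL and header name.
        URL url = new URL("https://rest.example.com/services/hello");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("X-SAML-Assertion", samlAssertion);

        BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()));
        String line;
        while ((line = in.readLine()) != null) {
            System.out.println(line);
        }
        in.close();
    }
}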

But that’s only one part of the solution. If you disable WebLogic’s issuance of cookies or you choose to not handle cookies in your thick client’s code WebLogic has still got your back.

WebLogic's Identity Assertion Cache
Decrypting a chunk of XML, parsing it, and extracting some data takes some CPU cycles, but isn't all that slow. Searching an LDAP directory to find a user, then doing another search to chase down all of the group memberships, on the other hand, takes real, measurable clock time. Some of that time is because you're doing a search, and some is because you're going over a physical wire to talk to the LDAP server, and those wires (AFAIK) are still subject to the laws of physics.

The WebLogic docs describe the setting in some detail. The Javadoc for the Authentication Provider says:

The assertIdentity() method of an Identity Assertion provider is called every time identity assertion occurs, but the LoginModules may not be called if the Subject is cached. The -Dweblogic.security.identityAssertionTTL flag can be used to affect this behavior (for example, to modify the default TTL of 5 minutes or to disable the cache by setting the flag to -1).

And the command line reference fills in some more details:

When using an Identity Assertion provider (either for an X.509 certificate or some other type of token), Subjects are cached within the server. This greatly enhances performance for servlets and EJB methods with <run-as> tags as well as for other places where identity assertion is used but not cached (for example, signing and encrypting XML documents). There might be some cases where this caching violates the desired semantics.
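
If you do need to tune this, the flag goes into the server's startup arguments. For example (assuming the value is in seconds, which lines up with the documented five-minute default, and remembering that -1 disables the cache entirely; verify against your WebLogic version):

-Dweblogic.security.identityAssertionTTL=600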

Wrapping it all up

So to summarize:

  • SAML is cool
  • smart devices are pretty cool, but they lack a SOAP stack
  • WebLogic’s SSPI framework is cool
  • the WebLogic engineering team thought of darn near everything

Oh, and if you combine a Servlet Auth Filter, the SAML SSPI Identity Asserter, and a teensy bit of code to handle cookies on the client side, you can do some pretty clever things.

Got a comment or question? Let me know below!

—-
Update: After having this up a few days I had a talk with someone out of band that in effect said “you said #1 is probably not the best way, but then you went through a whole discussion about 2/3 but wound up describing #1 and saying that it was best.” So I obviously need to clarify.

What I was talking about in #1 is actually invoking a specific login service. In other words, call login(String SamlAssertion) and have a token come back. The problem with that solution is that it's complicated, and if the token isn't acceptable for some reason you need to know how to go back and get a fresh one.

In the rest of the post I describe sending the SAML Assertion on every request and doing "the right thing" when it comes to cookies. If the server sees the cookies and can find the associated session, it won't bother checking the SAML assertion. If you don't have a cookie, the cookie or session is invalid, or something else goes wrong, then the server will go ahead and validate the SAML Assertion and establish a new session.

Hope that clears things up.


Identity Propagation from OAG to REST APIs protected by OWSM


Introduction

This post describes the necessary configuration for propagating an end user identity from OAG (Oracle API Gateway) to REST APIs protected by OWSM (Oracle Web Services Manager).

The requirements are:

1) Have a Java Subject established in the REST API implementation.

2) Prevent direct access to the REST API, i.e., only OAG should be able to successfully invoke it.

A recurrent question is how OWSM protects REST APIs and which types of tokens it supports when doing so.

If we look at the current OWSM (11.1.1.7) predefined policies, we notice a policy named

oracle/multi_token_rest_service_policy, described (verbatim) as:

“This policy enforces one of the following authentication policies, based on the token sent by the client:

HTTP Basic—Extracts username and password credentials from the HTTP header.

SAML 2.0 Bearer token in the HTTP header—Extracts SAML 2.0 Bearer assertion in the HTTP header.

HTTP OAM security—Verifies that the OAM agent has authenticated user and establishes identity.

SPNEGO over HTTP security—Extracts Simple and Protected GSSAPI Negotiation Mechanism (SPNEGO) Kerberos token from the HTTP header.”

In this specific use case, we are assuming the end user has already been authenticated by some other means before reaching OAG. In other words, we are assuming OAG gets some sort of token and validates the user locally, thus populating its authentication.subject.id attribute. This token OAG receives can be an OAM token, a Kerberos token, a SAML token, you name it. It is a matter of a design decision based on OAG's client capabilities.

In a use case like this, it's very unlikely that OAG will have the end user's password, which eliminates the HTTP Basic header option. The remaining three are all good candidates. In this post we deal with a SAML 2.0 Bearer token in the HTTP header. Our flow ends up being something like this: OAG Client -> "some token" -> OAG -> SAML 2.0 Bearer -> OWSM -> REST API.

We’re going to examine all necessary configuration in OAG, OWSM and in the REST API application. Buckle up, folks! And let’s do it backwards.

Main Article

REST API Web Application

Here’s my REST API Implementation in all its beauty:

package ateam.rest.impl;

import java.security.Principal;
import javax.security.auth.Subject;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import weblogic.security.Security;

@Path("/carmodels")
public class CarModels {
    public CarModels() {
        super();
    }

    @GET
    @Produces("application/json")
    public String getModels() {

        Subject s = Security.getCurrentSubject();
        System.out.println("[CarModels] Principals established for the propagated user id:");
        for (Principal p : s.getPrincipals()) {
            System.out.println(p.getName());
        }

        String json = "{\"models\":[\"Nice Car\",\"Fast Car\",\"Lightweight Car\",\"Sports Car\",\"Lovely Car\",\"Family Car\"]}";
        return json;
    }
}

It prints out the user principals and gives back a list of cars. Simple as that.

There’s a need for a servlet filter (plus a filter-mapping) to intercept requests to this API. Such a filter is provided by OWSM and works hand in hand with the policy we’ve briefly talked about previously.

<filter>
    <filter-name>OWSM Security Filter</filter-name>
    <filter-class>oracle.wsm.agent.handler.servlet.SecurityFilter</filter-class>
    <init-param>
      <param-name>servlet-name</param-name>
      <param-value>ateam.rest.impl.Services</param-value>
    </init-param>
</filter>

<filter-mapping>
    <filter-name>OWSM Security Filter</filter-name>
    <servlet-name>ateam.rest.impl.Services</servlet-name>
</filter-mapping>

Note that the filter mentions a specific servlet in <init-param>. This servlet simply exposes the REST API implementation to be protected.

package ateam.rest.impl;

import javax.ws.rs.core.Application;
import javax.ws.rs.ApplicationPath;
import java.util.Set;
import java.util.HashSet;

@ApplicationPath("resources")
public class Services extends Application {
    public Set<java.lang.Class<?>> getClasses() {
        Set<java.lang.Class<?>> s = new HashSet<Class<?>>();
        s.add(CarModels.class);
        return s;
    }
}

The servlet definition completes the necessary configuration in web.xml. Notice the servlet-class is actually Jersey’s ServletContainer.

<servlet>
    <servlet-name>ateam.rest.impl.Services</servlet-name>
    <servlet-class>com.sun.jersey.spi.container.servlet.ServletContainer</servlet-class>
    <init-param>
        <param-name>javax.ws.rs.Application</param-name>
        <param-value>ateam.rest.impl.Services</param-value>
    </init-param>
    <load-on-startup>1</load-on-startup>
</servlet>

OWSM

We’re going to attach oracle/multi_token_rest_service_policy policy to all REST endpoints in the domain. But only the implementations with the setup shown previously are going to have requests intercepted.

The way to attach the policy is via wlst, as shown:

> connect('weblogic','*****','t3://<admin-server-name>:<admin-port>') 
> beginRepositorySession()
> createPolicySet('owsm-policy-set-multi-token','rest-resource','Domain("<domain-name>")')
> attachPolicySetPolicy('oracle/multi_token_rest_service_policy')
> commitRepositorySession()

This is it. Notice that createPolicySet mentions ‘rest-resource’ as the resource type. This is key here.

Before asserting the user identity in the incoming token and thus establishing the Java subject, ‘oracle/multi_token_rest_service_policy’ requires the following characteristics from the received token:

  • It has to be Base64 encoded.
  • It has to be gzipped.
  • It has to be digitally signed.

Requirements #1 and #2 need no configuration in OWSM. But for #3 we need to import OAG's certificate into OWSM's keystore so that the token can be properly validated. Export OAG's certificate into a file using OAG Policy Studio and then import it into OWSM's default-keystore.jks using JDK's keytool.

> keytool -import -file ~/oag_cert.cer -keystore ./config/fmwconfig/default-keystore.jks -storepass <keystore-password> -alias oag_cert -keypass welcome1

OAG

The filter circuit in OAG has to create a SAML 2.0 Bearer assertion, sign it, gzip it, Base64 encode it and then add it to the Authorization HTTP header. Here’s the filter circuit.

[Screenshot: the OAG filter circuit]

I now highlight the most relevant aspects of each filter:

 

1) Create SOAP Envelope: this is just to satisfy the "Create SAML Authentication Assertion" filter, which expects an XML message. Here I use a SOAP envelope, but any simple XML document would work.

[Screenshot: Create SOAP Envelope filter configuration]

 

2) Set Authentication Subject id as DN: the point here is that the OWSM policy honors the Subject NameIdentifier format in the SAML Assertion. Therefore, if the format is X509SubjectName, we need to make sure to set the subject value to the user's Distinguished Name (DN). If the format is unspecified, sticking with the username is enough.

[Screenshot: Set Authentication Subject id as DN filter configuration]

Tip: You can set the format by setting the attribute authentication.subject.format. For example:

[Screenshot: setting the authentication.subject.format attribute]

3) Create SAML Authentication Assertion: the following screenshots describe the filter.

[Screenshots: Create SAML Authentication Assertion filter – Details, Location, Subject Confirmation Method, and Advanced settings]

 

4) Update Message: this step is necessary just to copy the saml.assertion attribute value created in the previous step to content.body, as expected by the next filter in the chain.

[Screenshot: Update Message filter configuration]

5) Sign SAML Assertion:

[Screenshot: Sign SAML Assertion – Signing Key selection]

Notice the Signing Key certificate. That’s the one to be exported and then imported into OWSM’s key store.

[Screenshots: Sign SAML Assertion – What to Sign, Where to Place the Signature, and Advanced options]

Notice “Create enveloped signature” is checked. It is required by the OWSM policy.

 

6) Retrieve SAML Assertion from Message:

[Screenshot: Retrieve SAML Assertion from Message filter configuration]

7) Gzip SAML Assertion (script): OAG has no filter to gzip messages. Therefore we rely on a script to do so. Notice it also Base64 encodes the message after gzipping it. The script outputs an attribute named data.base64, containing the assertion gzipped and encoded, ready to be sent.

importPackage(Packages.java.util.zip);
importPackage(Packages.java.io);
importPackage(Packages.javax.xml.transform);
importPackage(Packages.javax.xml.transform.dom);
importPackage(Packages.javax.xml.transform.stream);
importPackage(Packages.java.lang);
importPackage(Packages.oracle.security.xmlsec.util);
importPackage(Packages.com.vordel.trace);

function invoke(msg) {

   var data = msg.get("saml.assertion");  

   var source = new DOMSource(data.get(0).getOwnerDocument());
   var baos = new ByteArrayOutputStream();
   var result = new StreamResult(baos);
   var factory = TransformerFactory.newInstance();
   var transformer = factory.newTransformer();
   transformer.transform(source, result);
   var tokba = baos.toByteArray();

   baos = new ByteArrayOutputStream();
   var gzos = new GZIPOutputStream(baos);
   gzos.write(tokba);
   gzos.flush();
   gzos.finish();
   var gzdata = baos.toByteArray();

   var b64 = new Base64(); 
   b64.setUseLineBreaks(false);
   var b64tok = b64.encode(gzdata);

   msg.put("data.base64", b64tok);
   return true;         
}

8) Add SAML Assertion to HTTP Header: the Authorization header mechanism must be set to “oit”, as shown:

[Screenshot: Add SAML Assertion to HTTP Header filter, with the Authorization header mechanism set to "oit"]

9) Connect to Car Models Service:

[Screenshot: Connect to Car Models Service filter configuration]

At the end, this is what a good assertion would look like:

<?xml version="1.0"?>
<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion" ID="Id-cffa4f53f9490000090000004f131aad-1" IssueInstant="2014-04-17T16:01:19Z" Version="2.0">
  <saml:Issuer Format="urn:oasis:names:tc:SAML:1.1:nameid-format:X509SubjectName">www.oracle.com</saml:Issuer>
    <dsig:Signature xmlns:dsig="http://www.w3.org/2000/09/xmldsig#" Id="Id-0001397750479781-ffffffffd55f69c1-1">
      <dsig:SignedInfo>
        <dsig:CanonicalizationMethod Algorithm="http://www.w3.org/2001/10/xml-exc-c14n#"/>
        <dsig:SignatureMethod Algorithm="http://www.w3.org/2000/09/xmldsig#rsa-sha1"/>
        <dsig:Reference URI="#Id-cffa4f53f9490000090000004f131aad-1">
          <dsig:Transforms>
            <dsig:Transform Algorithm="http://www.w3.org/2000/09/xmldsig#enveloped-signature"/>
            <dsig:Transform Algorithm="http://www.w3.org/2001/10/xml-exc-c14n#"/>
          </dsig:Transforms>
          <dsig:DigestMethod Algorithm="http://www.w3.org/2000/09/xmldsig#sha1"/>
          <dsig:DigestValue>87KiwbLN11S3qwJw23Zm0Odh9QQ=</dsig:DigestValue>
        </dsig:Reference>
      </dsig:SignedInfo>
      <dsig:SignatureValue>UO6S7++uxuqqLPl4cege7vmZpQ1q6MXL51s/e/fDd74aZdrEOx+G1tqA4YQtVQIh
fTuOcd1CtOyEUqOLNy9F4e87Ld/cqNcr8iWGlokPEPP153r19MIaWSYDslYq10xe
cArsGeayx0PpWjXo0VSH+u26grsTWIY+YATuU7BcKnqrrWFjmRxHAK/towXtuiPL
NtNYVgI6dPXVzJ+2lGSiZKBDBFoV9zUFE98kU0f050e3mq2x2BwvQ7MQUkPYyadt
b+Ifn0Hcr77Fp7FYfM0gPAMt3X0Dm5qsrEo5WS47RkWDq6EEdQx9HFEQJMLdwABL
xC8gNTETalZs73xUUQu2CA==</dsig:SignatureValue>
      <dsig:KeyInfo xmlns:dsig="http://www.w3.org/2000/09/xmldsig#" Id="Id-0001397750479781-ffffffffd55f69c1-2">
        <dsig:X509Data>
          <dsig:X509Certificate>
MIICtjCCAZ4CBgE9RZO/rjANBgkqhkiG9w0BAQUFADAaMRgwFgYDVQQDEw9TYW1wbGVzIFRlc3Qg
Q0EwHhcNMTMwMzA3MTU1ODAwWhcNMzcwMTEwMTA1NjAwWjAjMSEwHwYDVQQDExhTYW1wbGVzIFRl
c3QgQ2VydGlmaWNhdGUwggEiMA0GCSqGSIb3DQEBAQUAA4IBDwAwggEKAoIBAQClSoXx8QPLrHMS
Ff/5m3uLrDhxHycPYkamDCouu89mSKhD7aEZy3QS0mvZHvY2N1TmuQcdTuOgSE5qyT20mBEUVBnU
1y4WLQqM5fKu0TmIAajtYWTOdTnSuwR3f9W4poSwRMDNkUb8gPiXZNHZiyzriRMus29ER61eYAdr
XFlv5emXqi2ZK2bpBdtO6Q641TM9kUWB4ZyMqkGtRys9m2hNaXVR8e7r2WUrA9LEx3bRpku/OodI
GS6Qy0C2vueHDrdLYhYGKfNIllagEXY+dBQI8t2qH7rXBmr16lYyKK8VYJqeud9/NCAxD78vzOLY
0q6WaisVCa6FE/KpgpNF8sbZAgMBAAEwDQYJKoZIhvcNAQEFBQADggEBAH3W3yCTSORmIq5uhNTC
Zvd6wPz+zNXT19GqKgeDgjYMJj/Bl8czRO9YZo5MslwHILLgDVdz+ux4nSS3mCZ+PQGO42p/6o6n
IQ31yGzfYjTZ/1+1//CWyVtEhuSv5oFE+Le5mvkf1kNUrW4//qOXtfwXy/Hq09E9eaXlnBxUTHls
cQkpfQW5bi5Go7FDUNpW5EXUgXrQ96qKWMMK7i1hm7r5o6TldxCq5ANlPo/sObFNooQDkBWSKJ5t
GTtPiXO8kqYWdNBvnSRDk1Qqsn6fdFz485WB0e0pqWg2SuZa1026gIqtQPekJDQzTm0qvAnh/Aoh
oKs1dNQxruBf+MFLisw=
        </dsig:X509Certificate>
      </dsig:X509Data>
    </dsig:KeyInfo>
  </dsig:Signature>
  <saml:Subject>
    <saml:NameID Format="urn:oasis:names:tc:SAML:1.1:nameid-format:X509SubjectName">cn=jane,cn=Users,dc=us,dc=oracle,dc=com</saml:NameID>
    <saml:SubjectConfirmation Method="urn:oasis:names:tc:SAML:2.0:cm:bearer"/>
  </saml:Subject>
  <saml:Conditions NotBefore="2014-04-17T16:01:18Z" NotOnOrAfter="2014-04-17T16:06:18Z"/>
  <saml:AuthnStatement AuthnInstant="2014-04-17T16:01:19Z">
    <saml:AuthnContext>
      <saml:AuthnContextClassRef>urn:oasis:names:tc:SAML:2.0:ac:classes:Password</saml:AuthnContextClassRef>
    </saml:AuthnContext>
  </saml:AuthnStatement>
</saml:Assertion>

Wrapping up…

With this configuration in place, at runtime the REST API implementation writes the following in the server’s log for a user authenticated as jane:

[CarModels] Principals established for the propagated user id:
jane

And any SAML assertion not signed by OAG is going to be promptly rejected by OWSM.

See you next time!

Integrating with Fusion Applications using SOAP web services and REST APIs (Part 1 of 2)


Fusion Applications provides several types of interfaces to facilitate integration with other applications within the enterprise and on the cloud. As one of the key integration interfaces, Fusion Applications (FA) supports SOAP services based integration, both inbound and outbound. At this point FA doesn't provide REST APIs, but they are planned for a future release. It is, however, possible to invoke external REST APIs from FA, which we will discuss. Oracle continues to invest in improving both SOAP and REST based connectivity. The content in this blog is based on features that were available at the time of writing.

In this two part blog, I will cover the following topics briefly.

1.  Invoking FA SOAP web services from external applications

a. Identifying the FA SOAP web service to be invoked
b. Sample invocation from an external application
c. Techniques to invoke FA services from an ADF application

2.  Invoking external SOAP Web Services from FA (covered in Part 2)

3.  Invoking external REST APIs from FA (covered in Part 2)

I'll touch upon some basics so that you can quickly build a few SOAP/REST interactions with FA. If you do not already have access to an FA instance (on-premise or SaaS), you can request a free 30-day trial of Oracle Sales Cloud at http://cloud.oracle.com.

1. Invoking FA SOAP web services from external applications

There are two main types of services that FA exposes

–          ADF Services – These services allow you to perform CRUD operations on Fusion business objects, for example the Sales Party Service, Opportunity Service etc. Using these services you can typically perform operations such as get, find, create, delete and update on FA objects. These services are typically useful for UI driven integrations, such as looking up FA information from external application UIs or using third party interfaces to create/update data in FA. They are also used in non-UI driven integration use cases such as initial upload of business or setup data, synchronizing data with external systems, etc.

–          Composite Services – These services involve more logic than CRUD and often involve human workflows, rules etc. These services perform a business function, such as the Get Orchestration Order Service, and are used when building larger process based integrations with external systems. These services are usually asynchronous in nature and are not typically used for UI integration patterns.

1a. Identifying the FA SOAP web service to be invoked

All FA web service metadata is available through an OER instance (Oracle Enterprise Repository), which is publicly available at http://fusionappsoer.oracle.com. This is the starting point for you to discover the services that you are going to work with. You do not need an FA account to browse the services using the above UI.

You can use the search area on the left to narrow down your search to what you are looking for. For example, you can choose the type as ADF Services or Composite, or narrow your search to a specific FA version, Product Family etc.

[Screenshot: OER search filters and results]

Once you hit 'Search' you can see all the details of the service. There are a number of details for each service, but to get started check the following:

–          Make sure that under the Taxonomy Tab, Keywords contains ‘EXTERNAL’.

–          Using the Details tab, identify the operation that you are planning to use.

–          Using the Details tab, view the Abstract WSDL URL (scroll down to the end to find it). At this point you need an actual FA instance (on premise or on the cloud) to get started. Once you have an actual FA instance, you can view the concrete WSDL by using the URL mentioned as 'Service Path'.

The Composite Services do not provide a Service Path and are not candidates for SaaS integration. On-premise customers have access to the SOA servers where these composites are deployed. Using the Component Name and SAR location under the Overview tab on fusionappsoer.oracle.com, customers can identify the composite in Oracle Enterprise Manager. Using the information in Enterprise Manager, customers can build integrations with other on-premise assets such as Siebel, E-Business Suite etc. These services are usually long running asynchronous processes and often involve human workflow, configuration and monitoring.

Oracle Fusion Distributed Order Orchestration is a good example where Oracle pre-built integrations with Siebel using these composite services. More information can be found at http://www.oracle.com/us/products/applications/fusion/supply-chain-management/fusion-distr-order-orch-ds-1555648.pdf

In the rest of this post, we will focus on building simple UI centric use cases using ADF services.

1b. Sample invocation of ADF services from an external application

Take a closer look at the concrete WSDL. A couple of items to notice here:

–          Most operations have synchronous as well as asynchronous variations.

–          Notice the security policy entry under the binding. The synchronous operations are usually associated with wss11_saml_or_username_token_with_message_protection_service_policy

To begin with, simply invoke these web services using a SOAP UI client. The following blog entry provides a detailed explanation and step by step instructions for doing this – http://www.ateam-oracle.com/using-soapui-for-secure-asynchronous-web-service-invocations-in-fusion-applications.

When dealing with external applications, ensure that the appropriate SSL certificates are imported, similar to what is done in the SOAP UI example in the link above. Presently FA Sales Cloud does not provide an interface to import SSL certificates, limiting certain integration abilities, especially when self-signed certificates are used.

You may also have noticed that the size of the response payload from the FA services is very large. Especially with Find operations, the results may be too large to be meaningful. Imagine that you want to build a simple search feature in your custom application where you want the search results to display a valid set of Sales Accounts – as defined in FA. You may want the results to display as a list of Accounts (organizations), with just four key pieces of information – Organization Name, Address, City and Country.

If the service invocation with the find operation returns ALL the details for each Sales Party, it becomes hard for the custom application to handle the output, and there is an unnecessary performance overhead as well. Fortunately, the Find operation provides a set of parameters in the input payload that can be used to control the output.

For example, the XML request payload in the SOAP UI blog linked earlier can be modified as follows.

Request:

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
   <soapenv:Body>
      <typ:findSalesParty xmlns:typ="http://xmlns.oracle.com/apps/crmCommon/salesParties/salesPartiesService/types/">
         <typ:findCriteria xmlns:typ1="http://xmlns.oracle.com/adf/svc/types/">
            <typ1:fetchStart>0</typ1:fetchStart>
            <typ1:fetchSize>10</typ1:fetchSize>
            <typ1:filter>
               <typ1:group>
               <typ1:conjunction>And</typ1:conjunction>
                  <typ1:item>
                     <typ1:attribute>PartyName</typ1:attribute>
                     <typ1:operator>STARTSWITH</typ1:operator>
                     <typ1:value>Art</typ1:value>
                  </typ1:item>
                  <typ1:item>
                     <typ1:attribute>PartyType</typ1:attribute>
                     <typ1:operator>=</typ1:operator>
                     <typ1:value>ORGANIZATION</typ1:value>
                  </typ1:item>	
               </typ1:group>
            </typ1:filter>
            <typ1:findAttribute>PartyName</typ1:findAttribute>
            <typ1:findAttribute>OrganizationParty</typ1:findAttribute>
            <typ1:childFindCriteria>
            		<typ1:childAttrName>OrganizationParty</typ1:childAttrName>
            		<typ1:findAttribute>Address1</typ1:findAttribute>
            		<typ1:findAttribute>City</typ1:findAttribute>
            		<typ1:findAttribute>Country</typ1:findAttribute>
            </typ1:childFindCriteria>
         </typ:findCriteria>
      </typ:findSalesParty>
   </soapenv:Body>
</soapenv:Envelope>

The above request translates to – give me the top 10 hits for Organizations whose names start with “Art” and give me only the Name, Address, City and Country details of these Organizations. When executed against my FA instance, the response displays something like below

Response:

<env:Envelope ..>
   <env:Header>..</env:Header>
   <env:Body>
      <ns0:findSalesPartyResponse ..">
         <ns2:result ..>
            <ns1:PartyName>Artemis International Solutions Corp</ns1:PartyName>
            <ns1:OrganizationParty ..>
               <ns3:Address1>1000 Louisiana</ns3:Address1>
               <ns3:Country>US</ns3:Country>
               <ns3:City>Houston</ns3:City>
            </ns1:OrganizationParty>
         </ns2:result>
         <ns2:result ..>
            <ns1:PartyName>Artisan Press Ltd</ns1:PartyName>
            <ns1:OrganizationParty ..>
               <ns3:Address1>4  BOSTON ROAD</ns3:Address1>
               <ns3:Country>GB</ns3:Country>
               <ns3:City>Leeds</ns3:City>
            </ns1:OrganizationParty>
         </ns2:result>
         <ns2:result ..>
            <ns1:PartyName>Artwise Messe</ns1:PartyName>
            <ns1:OrganizationParty..>
               <ns3:Address1>Bergengrünstr. 9</ns3:Address1>
               <ns3:Country>DE</ns3:Country>
               <ns3:City>Essen</ns3:City>
            </ns1:OrganizationParty>
         </ns2:result>
      </ns0:findSalesPartyResponse>
   </env:Body>
</env:Envelope>

Note that the response is much smaller and more manageable. The key here is to use the findAttribute to control the output. In my system, the response time for this request without the findAttribute was 2000 to 2500 ms. With the findAttribute, the response time improved to 500 ms.

In general, you can play with the findCriteria to exactly define what you want to search and what you want in the output. Think of it as the web service equivalent of the find functionality that you see in many enterprise applications. This is a powerful feature and is present only for the Find Operation.

Another key point to note about ADF web services is that they encapsulate details originating from more than one individual object in FA. For example in the case of SalesParty, in addition to the basic details of the Sales Party, the ADF service provides dependent information from OrganizationParty and SalesAccount objects. Each of these objects again bring in dependencies that they need.

This allows the consumers of the web service to get all relevant information in one go, without having to make multiple calls to retrieve related information. This makes ADF services granular, right-sized business services, which is a cornerstone of building robust web services based integration architectures.

1c. Techniques to invoke ADF FA services from an ADF based external application

Let us look at some techniques for invoking FA services from a custom built ADF application. This is useful when you have a standalone ADF based application in your organization, or if you are building ADF extensions to FA, say on Oracle Java Cloud Service.

Some parts of these instructions can be useful when invoking FA web services from any J2EE based application.

There are several ways to invoke FA web services from ADF. We will look at two simple options

–          Using ADF Web Service Data Control
–          Using Web Service Proxy

ADF Web Service data control is a simple and declarative approach to invoke a web service from ADF pages. The web service data control is particularly useful when the end objective is to simply display the results of a web service invocation on an ADF Page.

The Web Service Proxy option provides more flexibility and control, and can be used in conjunction with a programmatic VO or a bean data control.

The blog entry https://blogs.oracle.com/jdevotnharvest/entry/which_option_to_choose_for provides guidelines to pick a suitable approach when accessing web services in ADF.

If you have JDeveloper 11g installed in your system, you can simply follow the steps below to invoke your FA on premise or SaaS instance.

Using ADF Web Service Data Control

  • Create an Application of type Fusion Web Application. I named mine ‘SalesPartySample’. Ensure that you choose and shuttle ‘Web Services’ when on the Model project creation page. Accept all other defaults in the Wizard

[Screenshot: Fusion Web Application creation wizard]

  • Under the Model Project, create a web service data control and provide the WSDL for the SalesParty web service. Select the getSalesParty Operation. Note that you can also include custom HTTP Headers. We won’t be using it in this example but it is one way of authenticating against an FA service

[Screenshots: Web Service Data Control wizard – WSDL and operation selection]

  • In the Endpoint authentication, provide the username/password

[Screenshot: endpoint authentication settings]

  • At this point you should see something like this in the data controls. Notice the Parameters and results

[Screenshot: Data Controls panel showing the parameters and results]

  • Now we will use this data control from a simple UI. In the View Control Project, create a new JSF page. I created mine as index.jspx
  • Drag and drop the partyId parameter from the Data Control into the jspx. Choose the display type as Text Input w/ Label. Underneath it, drag and drop the getSalesParty(Long). Choose to display it as an ADF button. Underneath it, drag and drop the PartyName as Text Output. Drag and drop Address1, City and Country from under Organization Party. Your jspx should look like below

[Screenshot: index.jspx layout with the bound input, button and output fields]

  • Simply right click your .jspx and Run.
  • In this page, plug in a Party ID. I entered the Party ID for ‘Artisan Press Ltd’ that I got from the previous SOAP UI exercise

[Screenshot: running page showing the Sales Party details for the entered Party ID]

  • It's that simple! No coding effort!! Of course the UI can be made much better looking, and there could be more complex use cases, such as using custom HTTP Headers, that will require some amount of coding.
  • At this point if you face an SSL related error, it is because your JDev keystore doesn't have the necessary SSL certificates imported. To fix this, navigate to your FA instance and export the SSL certificate as .pem using your favorite browser (plenty of instructions on the internet). Import it into your JDev keystore as follows:

C:\Oracle\Middleware\jdk160_24\bin>keytool -importcert -alias fusionapps -file <locationtotheexportedPEM>\mypk.pem -trustcacerts -keystore C:\Oracle\Middleware\wlserver_10.3\server\lib\DemoTrust.jks -storepass DemoTrustKeyStorePassPhrase

Using Web Service Proxy

Next we will look at using a Web Service Proxy to invoke the Sales Party Service. This time we will not use the get operation but use the find operation. We will build the FindCriteria that we used in SOAPUI using Java.

  • Click New and select Web Service Proxy

[Screenshot: New Gallery – Web Service Proxy]

  • Choose JAX WS style
  • Enter the package name as oracle.sample.salesparty.proxy and oracle.sample.salesparty.proxy.types
  • Unselect generate as async and subsequently select Don’t Generate Async

[Screenshot: Web Service Proxy wizard – asynchronous method options]

  • You should see SalesPartyServiceSoapHttpPortClient.java with the text, “Add your code to call the desired methods”
  • It is a good practice not to use the above proxy client directly, because when the proxy gets regenerated any changes made to the client will be lost. Instead it is better to create a separate facade. The facade also allows you to control the data type of the response.
  • Create a facade like below. The idea is to build the same XML request payload using ‘findCriteria’ like we built in the earlier example. The getSalesPartyList method takes only “startsWith” as input from the user, builds the findcriteria and executes the service, returning a list of Sales Party records
public class SalesPartyFacade {
          private SalesPartyService_Service salesPartyService_Service;
          public List<SalesParty> getSalesPartyList(String startsWith)
          throws ServiceException
        {
          List<SalesParty> SalesParties;
          FindCriteria findCriteria = createFindCriteria(startsWith);
 
          SecurityPolicyFeature[] securityFeatures =
          new SecurityPolicyFeature[] { new SecurityPolicyFeature("oracle/wss_username_token_over_ssl_client_policy") };
          
          salesPartyService_Service = new SalesPartyService_Service();
          SalesPartyService salesPartyService = salesPartyService_Service.getSalesPartyServiceSoapHttpPort(securityFeatures);

          
          WSBindingProvider wsbp = (WSBindingProvider)salesPartyService;
          wsbp.getRequestContext().put(BindingProvider.USERNAME_PROPERTY,"User1");
          wsbp.getRequestContext().put(BindingProvider.PASSWORD_PROPERTY,"Passwd1");

          FindSalesParty fSalesParty= new FindSalesParty();
          fSalesParty.setFindCriteria(findCriteria);
          SalesParties = salesPartyService.findSalesParty(fSalesParty).getResult();

          return SalesParties;
        }

    private static FindCriteria createFindCriteria(String startsWith)
    {
      FindCriteria findCriteria = new FindCriteria();
      ChildFindCriteria childFindCriteria = new ChildFindCriteria();
      findCriteria.setFetchStart(0);
      findCriteria.setFetchSize(10);

      ViewCriteria filter = new ViewCriteria();
      ViewCriteriaRow group1 = new ViewCriteriaRow();
      ViewCriteriaItem item1 = new ViewCriteriaItem();
      item1.setAttribute("PartyName");
      item1.setOperator("STARTSWITH");
      item1.getValue().add(startsWith);
      
      ViewCriteriaItem item2 = new ViewCriteriaItem();
      item2.setAttribute("PartyType");
      item2.setOperator("=");
      item2.getValue().add("ORGANIZATION");
      

      group1.getItem().add(item1);
      group1.getItem().add(item2);
      group1.setConjunction(Conjunction.AND);
      
      filter.getGroup().add(group1);
      findCriteria.setFilter(filter);
      
      findCriteria.getFindAttribute().add("PartyName");
      findCriteria.getFindAttribute().add("OrganizationParty");
     
/*    childFindCriteria.setChildAttrName("OrganizationParty");
      childFindCriteria.getFindAttribute().add("Address1");
      childFindCriteria.getFindAttribute().add("City");
      childFindCriteria.getFindAttribute().add("Country");
      findCriteria.getChildFindCriteria().add(childFindCriteria);
*/  
      return findCriteria;
    }

}
  • Now, you can write a simple test class to test this facade.
        String filter = "Art";
        SalesPartyFacade spf = new SalesPartyFacade();
        List<SalesParty> salesparties = spf.getSalesPartyList(filter);
        for (SalesParty sp : salesparties) {
            System.out.println("Party Name = " + sp.getPartyName().getValue());
            System.out.println("Address1 = " + sp.getOrganizationParty().get(0).getAddress1().getValue());
            System.out.println("City = " + sp.getOrganizationParty().get(0).getCity().getValue());
            System.out.println("Country = " + sp.getOrganizationParty().get(0).getCountry().getValue());
            System.out.println();
        }
  • This facade can now be used in any manner to invoke the Sales Party Service from your UI.

This concludes part 1 of this two part blog. In part 2 of this post I’ll be talking about invoking external SOAP and REST services from Fusion Applications.


Integrating with Fusion Applications using SOAP web services and REST APIs (Part 2 of 2)


This is part 2 of a two-part blog post that covers SOAP and REST integration with Fusion Applications.

In part 1, I covered the topic of invoking FA SOAP web services from external applications. In this part, I will cover the topic of invoking external SOAP and REST services from FA.

Before discussing the details, here is the outline of this post.

1.  Invoking FA SOAP web services from external applications (covered in part 1)

2.  Invoking external SOAP Web Services from FA

3.  Invoking external REST APIs from FA

 

2. Invoking external SOAP Web Services from FA

FA CRM Application Composer allows you to invoke external web services from within Fusion Applications. This feature is only available in FA CRM and can be used in both on-premise and Cloud instances. Other FA products, such as HCM and Financials, do not presently support invocation of external web services in a declarative manner.

In your CRM instance, use the Navigator to navigate to Tools > Customization > Application Composer. If you cannot see this navigation, you may be missing privileges; contact your administrator.

Once in Application Composer, select your Application (say, Sales). From Common Setup, select Web Services. You will see a screen similar to the one below where you can enter your WSDL and choose the operation and security.

 

[Screenshot: Application Composer – Web Services registration page]

Once you create the web service, you can use it to invoke the external service from your UI elements. For example, this web service takes two inputs, Card Type and Card Number, and responds with whether the card number is valid for the given card type.

For this scenario we could create two new fields under a standard or custom object, as shown in the Application Composer. These fields will be used for capturing the card type and card number as inputs from the user. A third field can be added to hold the response. Using a Groovy expression, this third field can be made to invoke the web service for the given two input values.

More information is available in the official Oracle documentation.

A couple of additional points to notice here:

–          When using SaaS (Oracle Sales Cloud) you do not have a way to import SSL certificates. Sales Cloud comes with a set of CA certificates already imported, but cannot honor self-signed certificates. You would have to contact Oracle Support to do that.

–          The three security options in the figure above translate to invoking web services with one of the following security policies:

o   wss_username_token_over_ssl_service_policy

o   wss_username_token_with_message_protection_service_policy

o   wss11_saml_token_with_message_protection_client_policy

o   wss_saml_token_bearer_over_ssl_client_policy

o   wss11_saml_token_identity_switch_with_message_protection_service_policy

External web Services protected with other policies cannot be accessed at this point. Oracle is working on supporting more policies.

 

3. Invoking external REST APIs from FA

Presently there is no declarative way of registering external REST APIs in FA similar to the support for SOAP web services. At this point the only option is to do this programmatically using Groovy.

To implement this, we will again use the CRM Application Composer, but this time we will create a Global Function.

[Screenshot: Application Composer – creating a Global Function]

Provide a name and description for the global function, define the Return Type, etc. In the Function Body, we will use Groovy code to invoke the REST API.

[Screenshot: Global Function definition showing the return type and function body]

Below is sample Groovy code for a simple invocation of a REST API that returns the details of a specific customer. In this case the response is returned in XML format, so a simple XML parser is employed to parse it and return a string value.

At this point FA supports at least two ways to invoke REST APIs using Groovy, namely HTTPClient and HttpURLConnection. There are certain advantages to each approach, but in general HTTPClient is preferred for performance reasons. The code samples below illustrate both options.

Using HTTPClient:

def authString = "User1:Welcome1".getBytes().encodeBase64().toString()
def url = new URL("http://test.oracle.com/xxx/api/v1/customers/1234")
org.apache.http.impl.client.DefaultHttpClient httpClient = new org.apache.http.impl.client.DefaultHttpClient()
org.apache.http.client.methods.HttpGet getRequest= new org.apache.http.client.methods.HttpGet(url);
getRequest.addHeader("Accept", "application/xml");
getRequest.addHeader("Authorization", "Basic " + authString);

//actual invocation of the REST API
org.apache.http.HttpResponse response = httpClient.execute(getRequest);
def status = response.getStatusLine().getStatusCode();

//Retrieving response
def returnMessage = ""
def customerName = ""
if (status>= 300) {
  throw new org.apache.http.client.ClientProtocolException("Unexpected response status: " + status)}
org.apache.http.HttpEntity responseEntity = response.getEntity();
if (responseEntity != null) {
            returnMessage= org.apache.http.util.EntityUtils.toString(responseEntity);
        }

def customers = new XmlParser().parseText(returnMessage)
customerName = "Customer Name: " + customers.details.name.text()
return customerName

The sample above illustrates a GET operation. Similar to org.apache.http.client.methods.HttpGet, POST and PUT can be achieved through HttpPost and HttpPut respectively.
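
As a quick illustration, here is a sketch of what a POST with those classes could look like, written as the body of a helper method. The URL, credentials and JSON payload are placeholders, and since Groovy accepts Java-style syntax, essentially the same lines can be dropped into a global function body (the DatatypeConverter call simply mirrors the encodeBase64() step used above).

public String createCustomer() throws Exception {
    String authString = javax.xml.bind.DatatypeConverter.printBase64Binary("User1:Welcome1".getBytes());

    org.apache.http.impl.client.DefaultHttpClient httpClient = new org.apache.http.impl.client.DefaultHttpClient();
    org.apache.http.client.methods.HttpPost postRequest =
        new org.apache.http.client.methods.HttpPost("http://test.oracle.com/xxx/api/v1/customers");
    postRequest.addHeader("Authorization", "Basic " + authString);
    postRequest.addHeader("Content-Type", "application/json");

    // Placeholder JSON body for the new customer
    postRequest.setEntity(new org.apache.http.entity.StringEntity("{\"name\":\"John\"}"));

    org.apache.http.HttpResponse response = httpClient.execute(postRequest);
    int status = response.getStatusLine().getStatusCode();
    if (status >= 300) {
        throw new org.apache.http.client.ClientProtocolException("Unexpected response status: " + status);
    }
    return org.apache.http.util.EntityUtils.toString(response.getEntity());
}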

Using HTTPUrlConnection:

def authString = "User1:Welcome1".getBytes().encodeBase64().toString()
def url = new URL("http://test.oracle.com/xxx/api/v1/customers/1234")
def connection = url.openConnection()
connection.setRequestMethod("GET")
connection.setRequestProperty("Authorization", "Basic " + authString);
connection.setRequestProperty("Accept", "application/xml");
connection.connect()

//Retrieving response
def returnMessage = ""
def customerName = ""
if (connection.responseCode == 200 || connection.responseCode == 201){
returnMessage = connection.content.text
  
//Parsing the response  
def customers = new XmlParser().parseText(returnMessage)
customerName = "Customer Name: " + customers.details.name.text()
} else { 
customerName = "Error Connecting to: " + url
}
return customerName

The above samples illustrate simple XML parsing. When using JSON, the results can be parsed as follows

def returnMessage = '[{"id":1234, "name":"John"}]'   
groovy.json.JsonSlurper js = new groovy.json.JsonSlurper()
// Parse the response as Map
def map = js.parseText( returnMessage )
   
def customerName=map[0].name
return customerName

Once the global function is created with the necessary groovy code, this global function can be used in any logic for existing or new fields of an FA page.

The above examples hopefully helped you get a better understanding of Fusion Applications’ SOAP and REST capabilities. Oracle is continuously investing in this area. So stay tuned for more updates!

Aggregating WCS Content as a Service


Often it is necessary to exchange aggregated content from one system to another. By "aggregated" I mean that the complexity of the underlying Model is partially hidden from the caller and that the system doing the serving is free to make model changes without having to notify the caller. Additionally, the system providing the service will generally "clean up" and denormalize the data as appropriate for the service. As such, the response might include various bits of content from many assets (known only to the system providing the service), not just the one asset being requested. Being a mature and flexible CMS, WebCenter Sites is ideally suited to provide such an abstraction when integrating with other applications.

Let's take as a trivial example a simple flex Article asset that uses a CKEdit attribute as a container for its bodytext. And let's say we want WCS to do nothing more than provide the plain-jane content for this Article to another system. OOTB, we could export publish the content as pure XML, but the problem with this approach is that the content exported for the CKEdit attribute would not include the values of any embedded assets' data, because when exported as XML such data is not evaluated and only the embedded asset id references are provided. The following screenshot shows an all-too-typical use of the CKEdit attribute: embedding another asset inside the body field (in this case an image asset, but of course it could be anything):

[Screenshot: CKEditor body field with another asset (an image) embedded in the article body]

And below is an example of the above content exported as XML using XML Export. Note the (basically unusable) embedded asset reference inside the body.

<?xml version="1.0"?>
<document>
<asset id="1330881053255" type="AVIArticle" subtype="Article">
<attribute name="Attribute_headline"><string value="Agility Drills"/></attribute>
<attribute name="flextemplateid"><assetreference type="ContentDef" value="1327351718518"/></attribute>
<attribute name="Attribute_author"><string value="KEIRAN CONROY"/></attribute>
<attribute name="fw_uid"><string value="76a7c0ec-64af-4362-9644-ec4fa8675b88"/></attribute>
<attribute name="updateddate"><date value="2014-06-02 20:10:08.708"/></attribute>
<attribute name="status"><string value="ED"/></attribute>
<attribute name="subtype"><string value="Article"/></attribute>
<attribute name="updatedby"><string value="fwadmin"/></attribute>
<attribute name="createdby"><string value="fwadmin"/></attribute>
<attribute name="Group_Category"><assetreference type="ArticleCategory" value="1327351718595"/></attribute>
<attribute name="template"><string value="ArticleLayout"/></attribute>
<attribute name="createddate"><date value="2012-04-27 08:09:29.168"/></attribute>
<attribute name="Attribute_body"><string value="<p class="ckbody">
	Set two cones 10 yards apart and do the following:<br />
	<br />
	Start by facing forward in a staggered stance. On &quot;go,&quot; sprint to the opposite cone. At the cone, regain control, stop as quickly as possible and backpedal to the start.</p>
<p class="ckbody">
	At the start, turn your hips, plant your outside foot and begin side shuffling back to the far cone. When you reach the cone, plant your outside foot again and shuffle back to the starting cone.</p>
<p class="ckbody">
	Repeat the same sequence using carioca footwork (side step, crossover step, side step, crossover behind). After returning to the cone on the last carioca step, plant and sprint past the last cone.</p>
<p class="ckbody">
	The whole drill (seven changes) should be completed in 20 seconds or less depending on the age of the athlete. Rest 30-45 seconds and repeat. See how quickly you can change directions, as well as movements.</p>
<p class="ckbody">
	<span id="_CSEMBEDTYPE_=inclusion&amp;_PAGENAME_=avisports%2FAVIImage%2FSummary&amp;_ADDITIONALPARAMS_=thumbnail-field%3D&amp;_cid_=1327351718361&amp;_c_=AVIImage&amp;_deps_=exists"><i>[Asset Included(Id:1327351718361;Type:AVIImage)]</i></span></p>
<p class="ckbody">
	&nbsp;</p>
"/></attribute>
<attribute name="name"><string value="Agility Drills"/></attribute>
<attribute name="Attribute_postDate"><date value="2012-03-05 16:08:13.0"/></attribute>
<attribute name="Attribute_relatedImage"><assetreference type="AVIImage" value="1327351718361"/></attribute>
<attribute name="Publist"><array>
<integer value="1322052581735"/></array>
</attribute>
<attribute name="Attribute_subheadline"><string value="Mastering Changes of Direction "/></attribute>
</asset>
</document>

Such a use of XML export would be an example of a "bad" service — basically forcing the calling application both to know that there might be embedded assets nested within a single attribute (a key feature of WCS, btw) and to be responsible for interpreting such embedding and for making subsequent calls to fetch the embedded content, leading ultimately to a chatty application.

Fortunately, a straightforward solution exists: create a simple WCS JSP Template that “evaluates/renders” the nested/aggregated content using the <render:stream> tag. The result will be that all embedded/nested content will be converted to HTML values rather than references with the benefit that the underlying aggregation defined by the CKEdit attribute is now hidden from the caller. Given that there is a normal WCS Template that is doing the rendering, it is trivial to preview the result as there will be a URL that can request the result. Note: for the purposes of this blog I recommend you specify this Template to be “Element defines a whole HTML page and can be called externally” as shown below:

[Screenshot: Template form with "Element defines a whole HTML page and can be called externally" selected]

Here is the simple JSP code that does the CKEdit attribute aggregation (using AVISports coding pattern):

<%@ taglib prefix="cs" uri="futuretense_cs/ftcs1_0.tld"
%><%@ taglib prefix="render" uri="futuretense_cs/render.tld"
%><%@ taglib prefix="insite" uri="futuretense_cs/insite.tld"
%><%@ taglib prefix="ics" uri="futuretense_cs/ics.tld"
%><cs:ftcs>
<render:logdep cid='<%=ics.GetVar("tid")%>' c="Template"/>
<ics:callelement element="avisports/getdata">
	<ics:argument name="attributes" value="body" /> 
</ics:callelement>

<insite:edit field="body" value="${asset.body}" editor="ckeditor" params="{noValueIndicator: 'Enter body ', width: '627px', height: '500px', toolbar: 'Article', customConfig: '../avisports/ckeditor/config.js'}"/>

</cs:ftcs>

Using the following Jumpstart Kit preview URL: http://localhost:9080/cs?pagename=avisports/AVIArticle/AggregatedContent&cid=1330881053255&c=AVIArticle, the (now aggregated) rendering of the body result is:

Screen Shot 2014-06-09 at 1.49.41 PM

So far so good. The above only renders the aggregated part of the content. Most likely we would also need to expose the rest of the content’s metadata to the calling application (e.g. name, description, locale, etc.). Rather than laboriously adding individual metadata calls as XML entities, it is much easier to wrap the above Template in a truly generic/reusable code element that, in essence, wraps the rendered HTML in XML. The following code, packaged as a SiteEntry Wrapper whose cs.contenttype=application/xml (set in the resargs for the SiteEntry), does precisely that:

Screen Shot 2014-06-09 at 1.22.03 PM

<%@   taglib prefix="cs"     uri="futuretense_cs/ftcs1_0.tld"
%><%@ taglib prefix="asset"  uri="futuretense_cs/asset.tld"
%><%@ taglib prefix="ics"    uri="futuretense_cs/ics.tld"
%><%@ taglib prefix="render" uri="futuretense_cs/render.tld"
%><%@ page import="COM.FutureTense.Interfaces.*,
                   COM.FutureTense.Util.ftMessage,
                   COM.FutureTense.Util.ftErrors,
                   java.io.*"
%><cs:ftcs><%--

WRAPPER: XMLExportWrapper
INPUT:   a request for an asset's template during static publishing (must specify using this wrapper in addition to the Template)
OUTPUT:  The asset's template wrapped in XML metadata for consumption by 3rd party apps that need compositional content from WCS
NOTE:    be sure to specify cs.contenttype=application/xml in the resargs for this SiteEntry

--%><%
%><ics:if condition='<%=ics.GetVar("seid")!=null%>'><ics:then><render:logdep cid='<%=ics.GetVar("seid")%>' c="SiteEntry"/></ics:then></ics:if><%
%><ics:if condition='<%=ics.GetVar("eid")!=null%>'><ics:then><render:logdep cid='<%=ics.GetVar("eid")%>' c="CSElement"/></ics:then></ics:if><%

%><asset:load    name="anAsset" type='<%=ics.GetVar("c")%>' objectid='<%=ics.GetVar("cid")%>' editable="true" /><%
%><asset:scatter name="anAsset" prefix="exp" exclude="true" /><%
%><asset:export  name="anAsset" prefix="exp" output="assetXML" writeattrvalue="true" /><%// use false so that the exported XML has same format as below %><%

%><%

	String assetXML   = ics.GetVar("assetXML");
	String beginCDATA = "<![CDATA[";
	String endCDATA   = "]]>";
	String temp1      = assetXML.replace("</asset>", "");
	String temp2      = temp1.replace("</document>", "");

%><%= temp2 %><% // ****** this outputs the default WCS XML metadata of the asset, minus the closing tags. Must be the first line of output! %>

<attribute name="AggregatedHTML"><% // ****** this outputs the aggregated (i.e. Compositional) HTML %>
	<string>
		<%= beginCDATA %>
		<render:satellitepage pagename='<%=ics.GetVar("childpagename")%>' packedargs='<%=ics.GetVar("packedargs")%>'>
				<render:argument name='c'      value='<%=ics.GetVar("c")%>'/>
				<render:argument name='cid'    value='<%=ics.GetVar("cid")%>'/>
				<render:argument name='p'      value='<%=ics.GetVar("p")%>' />
				<render:argument name='d'      value='<%=ics.GetVar("d")%>' />
				<render:argument name="locale" value='<%=ics.GetSSVar("preferred_locale")%>'/>
		</render:satellitepage>
		<%= endCDATA %>
	</string>
</attribute>

</asset><% // be sure to close the XML tags %>
</document><% // be sure to close the XML tags %>
</cs:ftcs>

Using the following Jumpstart Kit preview URL that calls the wrapper: http://localhost:9080/cs?childpagename=avisports/AVIArticle/AggregatedContent&cid=1330881053255&c=AVIArticle&pagename=XMLWrapper, the result is:

Screen Shot 2014-06-09 at 1.51.05 PM

As you can see, the response to the browser is XML (and because of the cs.contenttype=application/xml resarg, the browser interprets it correctly). Note that it includes the aggregated content (the embedded Image asset here is simply converted to a “usable” URL), so the caller does not need to know or care where the other content came from. This simple pattern of using an XMLWrapper will work for ANY rendering Template for ANY asset type! The amount of aggregation is also up to you: it doesn’t have to be limited to converting embedded assets in a CKEdit attribute, but could include related assets, breadcrumbs, navigation, sibling assets, etc. The possibilities are limitless. Essentially, anything you want to hide from the caller but include in the content response for a given asset.

Using this model to Publish XML.

Without belaboring the point, we now have a fairly powerful integration point with WCS: if we wanted to export all the assets as “aggregated” XML, it would be trivial to do. Rather than create a custom publish mechanism (a fairly involved exercise that is beyond the scope of this blog post), I’d like to demonstrate a quick and easy way for you to see how it all could work. For the purposes of this blog, let’s resurrect the deprecated Static Publishing mechanism. To make Static Publishing work with 11.1.1.8 you need to update your futuretense.ini property file:

advancedUI.enableAssetForms=true

Changing this setting from false (the default as of 11.1.1.8) to true will allow you to set Starting Points and other needed features to gain access to the deprecated Static Publishing mechanism.

Once in your Admin Screen, go to Publishing and create a Static Publishing Destination:

Screen Shot 2014-06-09 at 2.01.28 PM

You’ll need to create and set a Starting Point (you can read much more in the Admin Guide, so I won’t go into the details here), but a typical Starting Point would be a Page asset with a Template that has a list of links to Articles. In effect, with a Starting Point you are creating a sitemap (or a subset thereof) that the system will use to “crawl” during Static Publishing, looking for new pages to publish. For our demo we can use our Article as a Starting Point. Note that here we are specifying the XMLWrapper as a wrapper (as shown in the screenshot below); the result is, as you might have guessed, that our asset will export static, aggregated XML.

Screen Shot 2014-06-09 at 2.04.16 PM

NOTE: The exported static files will have a .html extension (apparently hardwired into Static Publishing), but they are indeed XML.

Introduction to FMW 12C REST adapter


In this entry, I will present the usage of the new REST adapter in OSB 12c. This adapter is a new feature providing support for RESTful web services, including JSON requests and responses. One important thing to keep in mind is that the internal representation of the message within OSB is SOAP/XML, so there are transformations between JSON and XML during message processing.

 

Use Case

 

I have deployed a JAX-RS service on the server with the following interfaces and we are going to expose this on OSB.

 

@PUT
@Path("vip")
@Produces(MediaType.APPLICATION_JSON)
@Consumes({MediaType.APPLICATION_JSON})
public Response putUserVIP(CompoundObject element);

@PUT
@Path("putTest/{year}/{month}/{day}")
@Produces(MediaType.APPLICATION_JSON)
@Consumes(MediaType.APPLICATION_JSON)
public Response complexRest(
             CompoundObject obj,
             @PathParam("year") int year,
             @PathParam("month") int month,
             @PathParam("day") int day);

@GET
@Path("getTest/{year}/{month}/{day}")
@Produces(MediaType.APPLICATION_JSON)
public Response getUsers(
                    @PathParam("year") int year,
                    @PathParam("month") int month,
                    @PathParam("day") int day,
                    @QueryParam("from") int from,
                    @QueryParam("to") int to,
                    @QueryParam("orderBy") String orderBy);

@POST
@Path("otherwise/{year}/{month}/{day}")
@Produces("application/json")
public Response otherwise(
                @FormParam("name") String name,
                @FormParam("id") int id,
                @PathParam("year") int year,
                @PathParam("month") int month,
                @PathParam("day") int day);

 

The inbound CompoundObject has the format:

{ "key" : "keyname", "value" : "valuetext" }

 

The response is in the format

{ "value" : "valuetext" }

 

Steps

 

Step 1     Launch JDeveloper and start the integration server

 

Step 2     Create a new service bus application with service bus project

 

Step 3     Import the WADL of the JAX-RS service as Service Bus Resources

 

Right-click on the new project -> Import -> Service Bus Resources -> Resources from URL

 

RestAdapter12cFig1

 

RestAdapter12cFig2

Step 4    Create a REST service reference

 

Drag the REST adapter to the External Services swim lane.

 

RestAdapter12cFig3

 

Step 5     Define the REST bindings

 

In the operation bindings, select Add operations based on WADL service and use the WADL we just imported.

 

You will get warning dialogs; take some time to read them. The warnings are reminders that we still need to provide the operation binding details.

 

RestAdapter12cFig4

RestAdapter12cFig5

 

Since the operation bindings are not completed, we need to modify each one and provide the operation details:

 

  1. Request and Response schemas
  2. Request and Response payload types
  3. Request URI parameter details

 

For the getUsers operation request bindings, we need to provide the mappings from the internal SOAP/XML payload to the URI parameters used when invoking the outbound call. In this case, since this is a GET operation, the values come from the payload and are referenced in the format:

 

$msg.request/<parameter name>

 

RestAdapter12cFig6
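
For reference, assuming the parameter names in the generated WADL match the JAX-RS interface shown earlier, the getUsers request bindings would look roughly like this (a sketch, not the exact dialog contents):

year    -> $msg.request/year
month   -> $msg.request/month
day     -> $msg.request/day
from    -> $msg.request/from
to      -> $msg.request/to
orderBy -> $msg.request/orderBy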

 

In the Response dialog, we need to provide the schema for the result. Since we do not have an XSD schema defined yet, we are going to use the Native Format Builder to create one for us.

Click on the Define Schema for Native Format icon in the right corner.

 

RestAdapter12cFig7

The Native Format Builder allows us to define the schema from various formats; here we use JSON.

 

RestAdapter12cFig8

Let’s use a sample payload and the builder will generate the schema for us.

 

RestAdapter12cFig9

RestAdapter12cFig10

 

For the complexRest operation, in the Request tab, since this is a PUT with a JSON payload, the URI parameters cannot be mapped from the same payload. In this case we use an expression in the following format for the mappings; at runtime, the values can be provided in the outbound variable’s user-metadata.

 

$property.<property name>

 

RestAdapter12cFig11
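
Concretely, for the putTest/{year}/{month}/{day} template parameters the bindings would be along these lines (parameter names again assumed from the WADL), while the JSON body itself is still mapped from the payload:

year  -> $property.year
month -> $property.month
day   -> $property.day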

 

For the otherwise operation, we have both template parameters and a URL-encoded payload, so the request URI parameter mappings look like this:

 

RestAdapter12cFig22

Once we have completed all the details for every operation, the external business service is generated.

 

RestAdapter12cFig13

We can test the service by right-clicking on the external service and selecting Run.

 

RestAdapter12cFig14

 

RestAdapter12cFig15

RestAdapter12cFig16

 

Step 6   Create pipeline and expose as REST

 

Next we will create a pipeline, connect the business service and expose the pipeline as REST.

 

Right-click in the Pipeline/Split Joins swim lane and select Create Pipeline; upon completion, connect the pipeline to the RestReference service.

Note: do not select Expose as Proxy Service yet.

 

RestAdapter12cFig17

Right-click on the RestPipeline and select Expose as REST.

 

RestAdapter12cFig18

You will need to complete the REST bindings for the proxy service similar to the business service.  In our example, we just reuse the schemas and mappings created earlier,  but you are free to define your own interfaces and schemas.

 

RestAdapter12cFig19

 

Step 7   Complete the pipeline for pass-through actions

 

If you open the pipeline design, you will see that the pipeline is just a straight pass-through to the back end, using the inbound operation for the outbound request.

 

RestAdapter12cFig20

 

If you try to execute a PUT/POST operation that also has URI parameters (putTest, otherwise), you will receive an error complaining about missing parameters. This is because OSB stores those URI parameters in the inbound variable, and they need to be populated into the outbound variable.

 

<con:endpoint name="ProxyService$RestfulProject$RestService" xmlns:con="http://www.bea.com/wli/sb/context">
  <con:transport>
    <con:uri>/RestfulProject/RestService</con:uri>
    <con:mode>request-response</con:mode>
    <con:qualityOfService>best-effort</con:qualityOfService>
    <con:request xsi:type="http:HttpRequestMetaData" xmlns:http="http://www.bea.com/wli/sb/transports/http" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
      <tran:headers xsi:type="http:HttpRequestHeaders" xmlns:tran="http://www.bea.com/wli/sb/transports">
        <http:Accept>application/json</http:Accept>
        <http:Content-Type>application/x-www-form-urlencoded</http:Content-Type>
      </tran:headers>
      <tran:user-metadata name="year" value="2014" xmlns:tran="http://www.bea.com/wli/sb/transports"/>
      <tran:user-metadata name="month" value="5" xmlns:tran="http://www.bea.com/wli/sb/transports"/>
      <tran:user-metadata name="day" value="6" xmlns:tran="http://www.bea.com/wli/sb/transports"/>
      <http:query-parameters/>
    </con:request>

 

In order to pass the parameters to the backend web service, we need to insert the metadata if it exists. We will add an IF-THEN action in the request pipeline: if any user-metadata is present in the inbound request, we populate it into the outbound request:

 

IF 

fn:exists($inbound/ctx:transport/ctx:request/tp:user-metadata)

THEN INSERT

$inbound/ctx:transport/ctx:request//tp:user-metadata

AS LAST CHILD OF VARIABLE

outbound

XPATH

./ctx:transport/ctx:request

 

 

RestAdapter12cFig21

Once it is completed, you can test the external REST service.

 

Summary

 

In this blog, I showed the basic usage of the REST adapter:

 

  • Internal implementation as SOAP/XML
  • How to define REST business and proxy service
  • Use of native format builder to define schema for transformation
  • Use of $msg.request and $property expression in different use cases for request bindings
  • URI parameters stored as user-metadata in the inbound context variable for PUT/POST operations.

 

JDeveloper projects are here:  CodeSampleRestAdapter12c

 

Creating a Mobile-Optimized REST API Using Oracle Service Bus – Part 1


Introduction

To build functional and performant mobile apps, the back-end data services need to be optimized for mobile consumption. RESTful web services using JSON as payload format are widely considered as the best architectural choice for integration between mobile apps and back-end systems. At the same time, most existing enterprise back-end systems provide a SOAP-based web service application programming interface (API) or proprietary file-based interfaces. In this article series we will discuss how Oracle Service Bus (OSB) 12c can be used to transform these enterprise system interfaces into a mobile-optimized REST-JSON API. This architecture layer is sometimes referred to as Mobile Oriented Architecture (MOA) or Mobile Service Oriented Architecture (MOSOA). A-Team has been working on a number of projects with OSB 12c to build this architecture layer. We will explain step-by-step how to build this layer, and we will  share tips, lessons learned and best practices we discovered along the way. In this first part we will discuss how to design the REST API.

Main Article

Design Considerations

Let’s start with the first challenge: how do you design an API that is truly optimized for mobile apps? A common pitfall is to start with the back-end web services, and take that back-end payload as a starting point. While that may limit the complexity of transformations you have to do in OSB 12c (you could even use the automated “Publish-As-REST” function on a SOAP business service), it leads to an API that is anything but optimized for mobile. This brings us to our first recommendation:

The REST API design should be driven by the mobile developer.

He (or she) is the only one who can combine all the requirements, information and knowledge required for a good design:

  • he designs and builds the various screens, knows the supported form factors and knows exactly which data should be retrieved for which screen.
  • he knows the requirements for working in offline mode, and knows how this can be supported and implemented using his mobile development tool set.
  • he is responsible for data caching strategies to optimize performance in both online and offline scenarios
  • he decides which read and write actions can be performed in a background thread not impacting the user-perceived performance.

To illustrate how the above aspects impact the design of the API, we will introduce the sample “human resources” app that we will use throughout this article series. Let’s start with the three screen mockups our API should support:

mockups

A first design for the read (GET) resources can look like this

  • /departments: returns list of departments containing department number and name. A “quickSearch” query parameter might be added to support filtering if this cannot be implemented or is undesirable to perform on the device because of the size of the dataset.
  • /departments/{departmentId}: returns all department attributes for the department matching the {departmentId} path parameter and a sub list of all employees working in this department consisting of id, firstname and lastName attributes.
  • /departments/{departmentId}/employees/{employeeId}: returns all employee attributes for the employee matching the {employeeId} path parameter.

As you can see, this design is driven by the screens. It allows for “on-demand” data loading, using lean resources that only send the absolutely necessary set of data across the wire, minimizing payload size and maximizing performance. This design is clearly optimized for online usage of the application. If the mobile developer has to support an offline usage scenario, he would need to do the following to prepare the app for offline usage, storing all data locally on the device:

  • Call the /departments resource
  • Loop over all the departments returned, and for each department call the  /departments/{departmentId} resource.
  • Loop over all employees returned, and for each employee call the /departments/{departmentId}/employees/{employeeId} resource

Needless to say, this is not a very efficient way of loading data for offline usage. It can easily result in hundreds of REST calls, causing a delay of many minutes to prepare the app for offline usage. So, to support offline usage, it would be handy to add a query parameter “expandDetails” to the /departments resource which, when set to “true”, would return all department and employee data in one roundtrip.

Of course there are limits to the amount of data you can offload to your phone or tablet. You sure don’t want to store the complete enterprise back-end database on your phone!  So, in our sample, depending on the number of departments in the back-end database, we might need additional query parameters to allow the mobile user to select a specific subset of departments for offline usage.

At this point you might think: no worries, I only have to support an online usage scenario. Well, not so fast… A-Team has learned from experience in mobile engagements that more aggressive upfront data caching strategies might still be needed for various performance-related reasons:

  • The app might be used in parts of the world where network connectivity and bandwidth are unreliable, so users prefer a longer waiting time at app startup to prevent network hiccups while using the app.
  • The performance of the back-end API calls might turn out to be too slow for “on-demand” data loading.
  • The REST-JSON transformations in service bus are typically very fast. However, the required JSON payload might require assembling data from various back-end data sources, seriously degrading performance when looping over result sets is needed to get additional lookup data.

Let’s clarify this last point with an example. Assume there is a back-end HR interface that returns all employee data except for the job title (only job code is returned). Another “lookup” interface returns the job details including the job title. In service bus, a loop over all employees is then needed, and for each employee a call to the jobs “lookup” interface is needed to add the job title to the payload. If the lookup call takes just one second, the performance loss can already be significant with tens of employees returned.  In such a situation you have two options: cache the job lookup data in service bus to prevent expensive calls for each employee, or modify the JSON payload to pass only the job id and do the job lookup on the mobile device. This last option would require an additional /jobs resource in your design that returns all job titles for you to cache on the device.

In summary: various user groups might have different data caching needs, and initial data caching strategies might need to be revisited for performance reasons.

We can distill an important lesson from the above:

Your REST API design should be flexible in terms of the data caching options it can support.

Documenting the Design

Developers building traditional enterprise system interfaces often follow a design-by-contract approach. In the XML-based web services world, this means the use of XML Schemas (XSDs) and the Web Services Description Language (WSDL) to formally specify the interfaces. However, mobile developers live in a different world; they just think in JSON payloads! While there are emerging standards in the REST-JSON world like RAML and SWAGGER, most mobile developers currently prefer to use sample JSON payloads to document the design. Sample payloads can be changed easily, reflecting the agile and flexible nature mobile developers are used to when working with JavaScript-based mobile frameworks like Angular or Ionic.

So, since we started off by recommending that the mobile developer drive the design, we should facilitate him with a documentation format he is comfortable with: sample payloads, together with the resource path, path and query parameters, the HTTP method and a short description. Optionally, you can add security constraints. Here is the design of our HR sample REST-JSON API based on the above screens.

Resource | Method | Description | Req/Resp Payload
/departments?expandDetails=true/false | GET | Department list (query param is optional) | Sample
/departments/{departmentId} | GET | Department details and list of employees | Sample
/departments/{departmentId}/employees/{employeeId} | GET | Employee details | Sample
/departments/{departmentId}/employees | POST | Add new employee to department | Sample
/departments/{departmentId}/employees/{employeeId} | PUT | Update employee | Sample
/employees | GET | Employee list | Sample
/jobs | GET | Job list | Sample

The last two resources are needed to populate the manager and job drop down lists when updating employee data.

Another advantage of specifying and agreeing on the sample payloads upfront is the ability to start developing the mobile UI against a mock-up REST API while the service bus developer is busy implementing the real API. There are some great free tools on the market that make it easy to create such a mock-up API. A-Team successfully used a combination of Node.js, Express and MongoDB to do this. With MongoDB you can read and write any JSON payload without specifying a schema upfront, which makes it very easy to change the payload formats as you go. While we had no prior knowledge of these technologies, we got a working mock-up API in a few hours, after following this tutorial.

Implementing the Design

In part 2 of this article series we will start implementing this design using Oracle Service Bus 12c. We will integrate with a back-end interface that is provided in the form of an ADF BC SDO SOAP web service. In subsequent parts, all aspects of the service bus implementation will be discussed, including XML schemas, use of pipeline pairs and routers, pipeline templates, error handling, logging and security. No prior knowledge of service bus will be assumed. Stay tuned!

Creating a Mobile-Optimized REST API Using Oracle Service Bus – Part 2


Introduction

To build functional and performant mobile apps, the back-end data services need to be optimized for mobile consumption. RESTful web services using JSON as payload format are widely considered as the best architectural choice for integration between mobile apps and back-end systems. At the same time, most existing enterprise back-end systems provide a SOAP-based web service application programming interface (API) or proprietary file-based interfaces. In this article series we will discuss how Oracle Service Bus (OSB) 12c can be used to transform these enterprise system interfaces into a mobile-optimized REST-JSON API. This architecture layer is sometimes referred to as Mobile Oriented Architecture (MOA) or Mobile Service Oriented Architecture (MOSOA). A-Team has been working on a number of projects with OSB 12c to build this architecture layer. We will explain step-by-step how to build this layer, and we will share tips, lessons learned and best practices we discovered along the way. In part 1 we discussed the design of the REST API; in this second part we will discuss the implementation of the “read” (GET) RESTful services in service bus by transforming ADF BC SDO SOAP service methods.

Main Article

Getting Started

As of release 12.1.3 you can develop and test service bus applications inside JDeveloper. For this you need to download and install a separate “SOA Suite Quick Start” version of JDeveloper. Download page is here, installation instructions can be found here.

The SOA Suite Quick Start release of JDeveloper 12.1.3 has the same version number as the “vanilla” JDeveloper 12.1.3 release. This means that by default they will use the same system directory. This can cause weird and unexpected behavior. You need to make sure both JDeveloper releases use their own system directory by setting the JDEV_USER_HOME environment variable in the executable file that you use to launch JDeveloper (custom .bat file on Windows, JDeveloper Unix executable file inside package contents on Mac).

After starting the JDeveloper 12.1.3 release that comes with the SOA Suite Quick Start, you go to the File -> New -> Application gallery and choose Service Bus Application with Service Bus Project.

NewSBApp

We name the application and its project “HrApi”.

The input data for the HR REST API is primarily coming from an ADF Business Components application where we exposed some of the view objects as SOAP web services using the application module service interface wizards. So, before we start building out the service bus application, we need to have our ADF BC SDO SOAP service up and running. We can test the service using SOAP UI, or using the HTTP Analyzer inside JDeveloper. As shown in the screenshot below, the findDepartments method returns a list of departments, including a child list of employees for each department.

HttpAnalyzer

There is also a getDepartments method that takes a departmentId as a parameter; this method returns one department and its employees. We will use these two methods as the data provider for the two RESTful resources that we are going to create in this article:

Resource | Method | Description | Req/Resp Payload
/departments?expandDetails=true/false | GET | Department list (query param is optional) | Sample
/departments/{departmentId} | GET | Department details and list of employees | Sample

Creating the External HR SOAP Business Service

We start by opening the service bus overview diagram by double-clicking on the HrApi overview icon in the navigator window.

SBOverviewEmpty

In the components palette at the right, we now select the HTTP icon (Technology category) and drag and drop this onto the External Services area. The Create Business Service wizard is launched. We name the service "HRBusinessService" and we change the location to ensure the WSDL file and XML Schema files are stored in a dedicated directory for this business service. This avoids confusion later on when we start developing other business services and proxy services.

CBS1

After clicking Next, we select the WSDL option and we use the icon at the right of the WSDL field to launch the Select WSDL wizard. In this wizard, we click on the Application Server icon and enter the WSDL of our ADF BC SOAP service in the Selection field.

CBS2

After clicking OK, the Import Service Bus Resources wizard appears. On the first page we change the location again to store the resources in a business service specific sub-directory.

CBS2-1

When we click Next, we get an overview of the resources that will be imported. Looking at the long list and meaningless names, you can see why it is handy to store them in their own directory so you at least know where they are coming from.

CBS2-2

After clicking Finish, we return to page 2 of the Create Business Service wizard which is now showing our WSDL information including the port information.

CBS2-3

We now advance to the last wizard page which shows the transport and endpoint information which we can leave as is.

CBS3

After clicking Finish, the business service appears in the External Services area in our overview diagram.

SBOverviewES

It might happen that the ADF BC SOAP service changes after you created the external business service on the diagram. Service bus doesn’t provide an out-of-the-box refresh or refactoring option; however, here is how you can update the WSDL and associated schemas afterwards. Create a new external HTTP service using the same WSDL by repeating the above steps, but skip the last step: after returning from the Import Service Bus Resources wizard, you click Cancel to prevent creation of the business service on the diagram. Since you used the same import location, the existing business service will now use the latest ADF BC WSDL and schemas.

Creating the REST Proxy Service

We start with dragging and dropping the REST icon from the components palette (Technology category) onto the Proxy Services area of the overview diagram. In the Create Rest Binding dialog that appears we add a resource path /departments.

CRB1

Then we click on the plus icon in the Operation Bindings section to add an operation binding. The REST Operation Binding dialog appears. We give the operation a meaningful name, for example getDepartments. We set the HTTP Verb to GET, and specify the expandDetails query parameter as specified in the design. We can leave the Expression field blank. This field will automatically be filled when we exit the dialog later on by clicking OK.

CRB1.1

Next, we click on the Response tab. We mark the response as a JSON payload and then we need to define the XML schema element that defines the structure of the response. You might wonder “Why do we need an XML schema when we are returning a JSON payload?” The reason is that service bus internally works with XML only; we need to supply it with a schema element so service bus knows how to construct the JSON payload. Note that in a future release the service bus engine will have native support for REST and JSON, allowing us to specify the payload structure in emerging REST standards like RAML and SWAGGER.

But don’t worry if you are not familiar with writing XML schemas: service bus comes with a Native Format Builder wizard which allows you to create an XSD using a sample JSON payload. To launch this wizard, click the red-circled icon as shown in the screenshot below.

CRB2

On the first page, we enter a meaningful name for the XSD and we ensure the schema will be stored in the default Resources directory. Do not change this directory as it will cause errors in the .wsdl and .wadl files that are generated based on this definition (known issue).

NFB1

On the next page, we leave the type to JSON Interchange Format.

NFB2

On the third page we modify the target namespace, and root element name, and we use the sample payload that we created as part of our design in part 1 of this article series.

NFB3

The last page of the wizard shows the XSD as it will be created. Here is the full content of the HRRestService.xsd:

<?xml version = '1.0' encoding = 'UTF-8'?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema" 
xmlns="http://oracle.com/mobile/HRRestService_getDepartmentList_response" 
targetNamespace="http://oracle.com/mobile/HRRestService_getDepartmentList_response" 
elementFormDefault="qualified" xmlns:nxsd="http://xmlns.oracle.com/pcbpel/nxsd" 
nxsd:version="JSON" nxsd:jsonTopLevelArray="true" nxsd:encoding="US-ASCII">
   <xsd:element name="DepartmentListResponse">
      <xsd:complexType>
         <xsd:sequence>
            <xsd:element name="topLevelArray" maxOccurs="unbounded">
               <xsd:complexType>
                  <xsd:sequence>
                     <xsd:element name="id" type="xsd:integer"/>
                     <xsd:element name="name" type="xsd:string"/>
                  </xsd:sequence>
               </xsd:complexType>
            </xsd:element>
         </xsd:sequence>
      </xsd:complexType>
   </xsd:element>
<xsd:annotation xmlns="">
      <xsd:appinfo>NXSDSAMPLE=</xsd:appinfo>
      <xsd:appinfo>USEHEADER=false</xsd:appinfo>
   </xsd:annotation>
</xsd:schema>

There are two important things to note when looking at this XSD:

  • The maxOccurs property with value “unbounded” is added because in the sample payload we included two departments in the array. If the sample included only one department, then this property would have been left out, and we would have to add it manually afterwards. This property is important to get a loop over the departments when mapping the SOAP response to the JSON format using XQuery as we will see later.
  • The element named “topLevelArray” is not included in the REST response because inside the schema header element, the property jsonTopLevelArray is set to “true”.

When we click Finish in the Native Format Builder wizard, we return to the Rest Operation Binding dialog which now shows the XSD element that should be used for the response:

ROB1

We can click OK and we return to the Create Rest Binding dialog which shows our first complete resource definition:

CRB4

When we click OK we return to the overview diagram which now includes our REST proxy service.

OD2

The red cross icon indicates that we are not done yet, we still need to wire up the proxy service with a pipeline which in turn links to the business service. We will do this in the next section.

To create the department details resource, we can largely follow the same procedure. However, instead of creating a separate schema using the Native Format Builder dialog, we now manually add the DepartmentDetailsResponse element to the HRRestService.xsd we created with the department list resource. It is cleaner to group all elements related to one proxy service in one schema, and it eases refactoring of the schema, enabling reuse of element definitions, as we will see later on in part 3 of this article series.

However, there is one problem here: when setting the jsonTopLevelArray property to true in the header of the schema definition, we can only have one top-level element in the schema. So, we need to set this property to false if we want to add the DepartmentDetailsResponse element. When we do this we should also rename the top-level element as it will now appear in the JSON payload:

TopLevelArray

The JSON payload that will be returned will now look like this:

{"departments" : [ 
  { "id" : 10,
  "name" : "Administration"
  }, 
  {"id" : 20,
  "name" : "Marketing"
  }
]}

This is a slight deviation from our initial design but should not cause any real problems for the mobile developer consuming this resource.

We can now add the DepartmentDetailsResponse element definition to HRRestService.xsd:

 <xsd:element name="DepartmentDetailsResponse">
    <xsd:complexType>
       <xsd:sequence>
          <xsd:element name="id" type="xsd:integer"/>
          <xsd:element name="name" type="xsd:string"/>
          <xsd:element name="location" type="xsd:string"/>
          <xsd:element name="managerId" type="xsd:integer"/>
          <xsd:element name="managerName" type="xsd:string"/>
          <xsd:element name="employees" maxOccurs="unbounded">
             <xsd:complexType>
                <xsd:sequence>
                   <xsd:element name="employeeId" type="xsd:integer"/>
                   <xsd:element name="firstName" type="xsd:string"/>
                   <xsd:element name="lastName" type="xsd:string"/>
                </xsd:sequence>
             </xsd:complexType>
          </xsd:element>
       </xsd:sequence>
    </xsd:complexType>
 </xsd:element>

Although we add this new element to the existing XSD, we still can use the Native Format Builder to save us some tedious typing: we can launch the wizard, use the sample payload as specified for the department details resource and copy the generated XSD element from the last wizard page to the clipboard, and then cancel the wizard. We then can paste the element into the existing XSD.

We now re-enter the Rest Proxy Binding dialog by right-mouse-clicking on the proxy service and then choosing Edit REST.

EditRest

We add the resource path /departments/{departmentId} and specify the operation binding name getDepartmentDetails. Rather than clicking the icon to launch the Native Format Builder to create a new XSD, we now click the lamp icon, which allows us to select the DepartmentDetailsResponse element type from the HRRestService.xsd that we just added manually.

ChooseResponseElement

We click OK twice, and we return to the Create Rest Binding  dialog which now shows two completed resource definitions.

CRB5

Creating the HR Pipeline

We are ready for the last and most important step: adding a pipeline that links the REST proxy service with the SOAP business service and performs the SOAP-to-JSON transformation for both resources. We start with dragging and dropping the Pipeline icon (under the Resources category) onto the Pipelines/Split Joins area of the overview diagram. This launches the Create Pipeline Service dialog. On the first page we set the name to HRPipeline.

CPS1

On the second page, we select the WSDL radio button, and then, using the Browse WSDLs icon at the right, we can pick the HRRestService.wsdl which was auto-generated for us when we defined the REST resources in the proxy service. You might be confused at this point as to why we need to select the proxy service WSDL and not the business service WSDL. Well, to keep it simple: we want to create a mobile-optimized REST API, which means we take the proxy service as a starting point; in other words, we are working from left to right in service bus.

CPS2-1

We also need to uncheck the checkbox Expose as a Proxy Service, since we already defined our proxy service, and we want to take our proxy service WSDL as the starting point for the pipeline transformation.

CPS2-2

After clicking Finish, the pipeline is added to the diagram, and we can connect both the proxy service and the business service with the pipeline by dragging a line between the arrows in the three services. The diagram now looks like this:

SBOReady

We double-click on the pipeline which opens up the design view of the pipeline:

PL1

We start with dragging and dropping an Operational Branch from the component palette onto the diagram. We release the mouse on the circle in between the top-level HRPipeline icon and RouteNode1, and the diagram now looks like this.

PL2

The default branch will be executed when no REST operation is specified. This will never happen in our scenario, so we don’t need it. We cannot completely remove the default branch, but nothing should happen in it. So, we move RouteNode2 with its nested Routing element to the getDepartmentList branch. We can do this by clicking on the icon in the upper left corner of RouteNode2 and then dragging and dropping it onto the circle that appears just below getDepartmentList. We also need another branch for the getDepartmentDetails method. We can add a branch by clicking on the upper right icon inside the BranchNode1 element. A new branch appears and we can use the Operation dropdown list in the properties window of the branch to select the correct operation.

PL3

We need a route node with nested routing inside the getDepartmentDetails branch, just like we have inside the getDepartmentList branch. We can do this by dragging and dropping the Route icon from the component palette, and then dragging and dropping the Routing icon inside it. Or, we can copy RouteNode2 using the popup menu that appears when we right-mouse-click on the RouteNode2 icon, and then paste it onto the getDepartmentDetails branch.

CopyNode

The diagram now looks like this

PL4

When we click on the Routing element inside RouteNode2, we can specify to which business service method this request must be routed using the Operation dropdown list in the property inspector. In our sample, this is the findDepartments method.

Routing

Likewise, we specify method getDepartments as the Operation for the Routing element inside the getDepartmentDetails branch.

Before we continue it is time for a little explanation. Depending on the REST operation requested through the proxy service, the corresponding operation branch will be executed. Within a branch we have a routing consisting of a Request Action and a Response Action. The Request Action route is executed when the proxy service operation is invoked. The Response Action route is executed when the business service method returns a response. In other words:

  • in the Request Action route we need to transform the JSON query and path parameters and/or JSON payload (not applicable for GET requests) to a SOAP request body that is understood by the ADF BC SOAP Service.
  • in the Response Action route we need to transform the SOAP response body to the JSON payload.

Now, at the risk of losing you, I am going to sharpen the above explanation because it is not entirely in line with how service bus really works. As mentioned before, service bus uses only XML internally. So, the moment a REST-JSON request arrives, the payload is immediately converted to XML, using the WSDL/WADL and schema created while defining the proxy service. This XML payload, together with the query and path parameters, is the input for creating the SOAP request body.

=> In the Request Action path, we need to tell service bus how the proxy service XML can be transformed into business service SOAP request body XML.

Likewise, the SOAP response is not directly converted to JSON; it is first transformed into an XML structure that maps to the XML schema element type used to define the required JSON payload. Only just before the response “leaves” service bus is it transformed into JSON format.

=> In the Response Action, we need to tell service bus how the SOAP response body XML can be transformed to the proxy service response XML payload.

So, all transformations are from XML to XML, and we “tell” service bus how to do these transformations using XQuery. There are other transformation options in service bus; we will explain later why using XQuery is such a good idea.

Let’s start with transforming the getDepartmentDetails SOAP response. The actual transformation is done using a “Replace” action, and in the properties of the Replace action we can specify the XQuery resource that should be used for the transformation. We first create the XQuery file. We can do this using the File -> New menu, choosing XQuery File ver 1.0.

NewQRY

The Create XQuery Map Main Module dialog appears. Since you will typically get a lot of XQuery files, it is important to give them meaningful names and store them in their own sub-directory (“transformations” in this sample). Any naming convention is fine; we typically use a logical name suffixed with “BS2PS” or “PS2BS”, indicating whether the transformation direction is from business service to proxy service or vice versa.

MAP1

An XQuery file performs a transformation, and for this transformation it needs an input and an output. The actual value of the input and output is defined when we use the XQuery file for a specific transformation, for example inside a Replace action. Making an analogy with Java, you could see the XQuery file as a reusable Java method with one or more input parameters and one output parameter, where the actual parameter values are passed in when calling the method. So, in this dialog, we only define the parameter types, or in the vocabulary of XQuery, we define the source parameter types and the target type. We will start by defining the source parameter type. We click on the plus icon at the right of the (empty) Sources list, and a Function Parameter dialog appears. We can give the source parameter a name, for example “input”.

MAP2

Then we click on the edit icon at the right of the Sequence Type field. The Function Parameter Type dialog appears. In this dialog we click the Type Chooser icon at the right of the Schema Object Reference field, so we can pick our ADF BC response element type. You might wonder which XSD node to open, but you can simply expand the top one; it will automatically expand the XSD node that contains all the HR element types. Since we are invoking the getDepartments method, we select the getDepartmentsResponse element type.

MAP3

We click OK three times which brings us back to the base XQuery file dialog which now shows the input parameter together with its type definition.

MAP4

We define the output (target) type in a similar fashion as the input parameter type. In the Type Chooser dialog we now open the HRRestService.xsd and select the DepartmentDetailsResponse element type.

MAP5

After clicking OK twice we return to the base dialog.

MAP6

We are done setting up source and target, so we click OK which brings up the XQuery Mapper diagram. This diagram provides a nice visual way to define the mappings by dragging and dropping lines from source elements to target elements.

QueryMapper

If you are curious about the XQuery language, you can click the XQuery Source tab to see the code that is generated while using the visual mapper.
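
To give an idea of what that source looks like, here is a minimal hand-written sketch of the DepartmentDetailsBS2PS mapping. It is a sketch only: the ns0 namespace and the result/EmployeesView element names of the ADF BC response are assumptions, and the target namespace declarations are omitted for brevity; your generated file will use the names from your own service schema.

(: DepartmentDetailsBS2PS.qry - sketch only, source element names assumed :)
declare namespace ns0 = "http://xmlns.oracle.com/apps/hr/types/";   (: assumed ADF BC types namespace :)

declare variable $input as element() external;

<DepartmentDetailsResponse>
    <id>{data($input/ns0:result/ns0:DepartmentId)}</id>
    <name>{data($input/ns0:result/ns0:DepartmentName)}</name>
    <location>{data($input/ns0:result/ns0:LocationId)}</location>
    <managerId>{data($input/ns0:result/ns0:ManagerId)}</managerId>
    <managerName>{data($input/ns0:result/ns0:ManagerName)}</managerName>
    {
        (: one employees element per employee row returned in the SOAP response :)
        for $emp in $input/ns0:result/ns0:EmployeesView
        return
            <employees>
                <employeeId>{data($emp/ns0:EmployeeId)}</employeeId>
                <firstName>{data($emp/ns0:FirstName)}</firstName>
                <lastName>{data($emp/ns0:LastName)}</lastName>
            </employees>
    }
</DepartmentDetailsResponse>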

We now drag and drop a Replace action from the component palette (Message Processing category)  onto the circle that appears just below the Response Action.

Replace

In the Replace – Properties window, we set the Location to Body, and the Replace Option should be set to Replace Node Contents. To use the XQuery mapping file we just created, we click the icon at the right of the Value field, and choose XQuery Resources.

ReplaceProps

In the XQuery Transformation Expression Builder dialog that appears, we use the search icon to select our DepartmentDetailsBS2PS mapping file into the XQuery Resource field.

UseMapper1

Our source parameter that we named “input” automatically shows up in the XQuery Variables section. We now need to set the “binding” of this variable. To enter a binding expression we click the icon to launch the XQuery Expression Builder. Unfortunately, the builder only provides help with selecting the body from the proxy service: if you expand the body, you see the two possible body elements getDepartmentList and getDepartmentDetails. That is helpful when we specify the Request Action later on, but it is not useful, and even confusing, while specifying the Response Action. So, how do we know what to enter as the expression? The easiest way is to run the getDepartments method in SOAP UI or in the HTTP Analyzer of JDeveloper:

getDeaprtmentsResponse

We need to return the first element inside the body element, which results in the following expression:

$body/ns0:getDepartmentsResponse

When we enter this expression in the builder dialog, you will notice the red underlining of the expression, as shown below:

UseMapper2

This is caused by a missing namespace definition. We need to click on the Namespaces tab, click the plus icon, and add our namespace as defined in the SOAP response we got in the HTTP Analyzer.

AddNamespace

After clicking OK, the new namespace is added and the red underlining of the expression is gone.

UseMapper3

We click OK twice and the Replace – Properties window now shows our XQuery transformation file used in the Value property of the Replace action.

ReplaceProps2

Admittedly, this was not a trivial step; however, once this issue with the builder dialog showing the wrong body elements is fixed in a future release of service bus, it will be faster and more intuitive.

OK, on to the Request Action path. Here we need to do a transformation in the opposite direction. The input of the transformation is the departmentId path parameter and the output is the business service request body, which, as we learned from looking at the HTTP Analyzer, should look something like this:

   <env:Body>
      <ns1:getDepartments>
         <ns1:departmentId>20</ns1:departmentId>
      </ns1:getDepartments>
   </env:Body>

You might wonder, do we really need to use an XQuery mapping file for this simple transformation? It is just fixed text; only the value of departmentId is dynamic. The answer is no: we can also use a straight XPath expression to construct the request body. To illustrate this, we first drag and drop a new Replace action onto the circle just below the Request Action of the getDepartmentDetails branch. We select Body as the value for Location again, and we select the Replace Node Contents radio button. Selecting this radio button ensures that we only replace the content of the request, that is, the elements inside the <env:Body> element.

To set the Value field, we click on the icon and this time we choose Expression. In the expression builder, we first copy the sample body from above and remove the value 20, which we need to replace with an XPath expression that retrieves the departmentId path parameter. This time the expression builder is of great help: we can nicely expand the getDepartmentDetails request body to get to the departmentId parameter. We click Insert Into Expression and then we surround the expression with data(). This is because the expression returns the departmentId XML element, not the value itself. Surrounding it with data() will return the value, which is what we need here.

We are almost done now; we only have to add a namespace to get rid of the red underlining. However, the required namespace is the same as the one we needed in the Response Action, and since namespaces are shared within a routing, we can simply reuse that namespace definition. All we need to do is change ns1 to ns0, since that was the prefix we used in the Response Action. So, the correct expression looks like this:

DataFix
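
For reference, the final Replace expression ends up looking roughly like the sketch below. The dep prefix is a placeholder for whatever prefix the expression builder inserted for the proxy request elements; ns0 is the business service namespace we registered in the Response Action:

<ns0:getDepartments>
    <ns0:departmentId>{ data($body/dep:getDepartmentDetails/dep:departmentId) }</ns0:departmentId>
</ns0:getDepartments>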

Now, this was easier and faster than using an XQuery file transformation, right? Still, we believe it is better to use XQuery transformation for serious service bus applications as is explained in the next tip.

Although service bus supports other options for XML transformations, we recommend always using XQuery transformation files. By exclusively using XQuery files, refactoring as a result of changes to business or proxy services can be done locally within the XQuery files. The pipeline definitions can remain unchanged, and since there are no transformations “hardcoded” inside XPath expressions, there is no (or at least much less) risk of incomplete refactoring exercises.

Let’s follow the recommendation and create an XQuery mapping file instead. We create a new XQuery file and name it “DepartmentDetailsPS2BS”. This time the source departmentId parameter is not a complex type but a simple string. We select string by expanding the XML Schema Simple Types node in the Type Chooser:

MAPSimple1

For the target, we select the getDepartments element:

MAPSimple2

The XQuery dialog looks like this

MAPSimple3

We click OK to get the mapper diagram. We first need to insert the departmentId element in the target type using the popup menu.

MAPSimple4

And then we can draw a line between the source and target departmentId elements.

MAPSimple5
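
The generated XQuery source for this mapping is tiny. A sketch of what it boils down to is shown below; the ns0 namespace is an assumed placeholder for the ADF BC service namespace:

(: DepartmentDetailsPS2BS.qry - sketch only :)
declare namespace ns0 = "http://xmlns.oracle.com/apps/hr/types/";   (: assumed ADF BC namespace :)

declare variable $departmentId as xs:string external;

<ns0:getDepartments>
    <ns0:departmentId>{$departmentId}</ns0:departmentId>
</ns0:getDepartments>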

We now go back to the Replace action and change the Value expression to use our XQuery transformation file. The binding expression for the departmentId variable can easily be inserted using the builder dialog, just like we did when using the straight XPath expression.

DataFix2

That’s it, we are done with specifying the getDepartmentDetails branch.

Overview

We will leave it as an exercise for you to specify the request and response actions for the getDepartmentList branch. Note that you can ignore the expandDetails query parameter for now; we will discuss the implementation of this parameter in the next part of this article series. If you get stuck, you can take a look at the sample project; download links can be found at the bottom of this post.

Testing the REST Proxy Service

Service bus provides an easy way to test your services inside JDeveloper. You can right-mouse-click on each service and then choose Run.

run1

This will deploy the complete service bus application and will launch a tester window.

Run2

When we click the Execute button, the department list is shown:

Run3

Likewise, we can test the department details resource using departmentId 20:

Run4

Which results in the following response:

Run5

We can also test the REST resources outside JDeveloper. Postman is an excellent Google Chrome extension to test RESTful services. To use Postman, we need to know the full endpoint URL we should use. For this, we choose Edit on the proxy service popup menu, and then we click the Transport tab.

EndpointURI

We set the Endpoint URI field to /hr, which leads to a URL like this to test the department details resource:

http://host:port/hr/departments/20

This is how the result looks in Postman:

Postman

Conclusion

We have provided you with detailed step-by-step instructions on how to create two RESTful resources that follow the design we described in part 1 of this article series. If you are new to service bus you might be a bit overwhelmed by the number of configuration steps and dialog windows you have to go through. However, it is our experience (we started as newbies just like you not that long ago) that you very quickly get used to the sequence of steps you need to take. With the explanations, tips and recommendations in this blog post we hope to save you some valuable time and avoid some of the frustration that is common when learning a new product.

Learning service bus to create REST APIs is an investment that quickly pays off. Developing mobile applications, for example using Oracle’s Mobile Application Framework, becomes much easier and faster if you can use a mobile-optimized REST API rather than complex back-end web services. And even more important, the performance of your mobile app is likely to be much better too.

In the next parts we will dive into more advanced concepts. We will implement the expandDetails query parameter, implement the other PUT and POST resources, discuss troubleshooting techniques, add error handling and take a look at pipeline templates. Stay tuned!

Downloads:

 


Creating a Mobile-Optimized REST API Using Oracle Service Bus – Part 3


Introduction

To build functional and performant mobile apps, the back-end data services need to be optimized for mobile consumption. RESTful web services using JSON as payload format are widely considered as the best architectural choice for integration between mobile apps and back-end systems. At the same time, most existing enterprise back-end systems provide a SOAP-based web service application programming interface (API) or proprietary file-based interfaces. In this article series we will discuss how Oracle Service Bus (OSB) 12c can be used to transform these enterprise system interfaces into a mobile-optimized REST-JSON API. This architecture layer is sometimes referred to as Mobile Oriented Architecture (MOA) or Mobile Service Oriented Architecture (MOSOA). A-Team has been working on a number of projects with OSB 12c to build this architecture layer. We will explain step-by-step how to build this layer, and we will  share tips, lessons learned and best practices we discovered along the way.

Main Article

In part 1 we discussed the design of the REST API; in part 2 we discussed the implementation of the “read” (GET) RESTful services in service bus by transforming ADF BC SDO SOAP service methods. In this third part, we will implement the “expandDetails” query parameter in the /departments GET resource that we discussed in part 1, and we will implement the POST and PUT methods to create and update a department.

Refactoring the HR Proxy XML Schema

Before we continue with the implementation of the REST API, it is time to revisit the XML Schema that we created for the HR REST proxy service. The HRRestProxy.xsd contains two elements, DepartmentListResponse and DepartmentDetailsResponse. Both elements contain a (partially overlapping) list of department attributes, and we want to prevent further duplication of department attributes while implementing the remainder of the HR REST API.

We can do this by specifying two complex types that hold the attributes for employee and department, and then reference these types in the element definitions rather than defining anonymous complex types within each element as we did before. Here is the refactored body of HRRestProxy.xsd now using the “named” complex types:

<xsd:element name="DepartmentListResponse">
  <xsd:complexType>
    <xsd:sequence>
      <xsd:element name="departments" type="DepartmentType" maxOccurs="unbounded"/>
    </xsd:sequence>
  </xsd:complexType>
</xsd:element>

<xsd:element name="DepartmentDetailsResponse" type="DepartmentType"/>

<xsd:complexType name="DepartmentType">
  <xsd:sequence>
    <xsd:element name="id" type="xsd:integer"/>
    <xsd:element name="name" type="xsd:string"/>
    <xsd:element name="locationId" type="xsd:integer"/>
    <xsd:element name="locationName" type="xsd:string"/>
    <xsd:element name="managerId" type="xsd:integer"/>
    <xsd:element name="managerName" type="xsd:string"/>
    <xsd:element name="employees" type="EmployeeType" maxOccurs="unbounded"/>
  </xsd:sequence>
</xsd:complexType>

<xsd:complexType name="EmployeeType">
  <xsd:sequence>
    <xsd:element name="employeeId" type="xsd:integer"/>
    <xsd:element name="firstName" type="xsd:string"/>
    <xsd:element name="lastName" type="xsd:string"/>
  </xsd:sequence>
</xsd:complexType>

We left the element names unchanged. If you have a need to change the element names as well, you need to modify the source of the corresponding XQuery transformations to return the correct element name and you need to redefine (remove and re-add) the REST operation binding since it is not possible to change the request or response element name for a REST operation binding once it has been created.

Also note that the DepartmentListResponse element now contains all department attributes as well as the child employees through the reference to DepartmentType. This is not a problem since these additional attributes are not mapped in the DepartmentListBS2PS XQuery transformation, so the /departments resource will still only return department id and name in the JSON payload.

ListMapping

Implementing the ExpandDetails Query Parameter

With the refactored XML Schema, we can now easily create a new XQuery transformation that returns all department attributes as well as the child employees when the expandDetails query parameter is set to true. The easiest way to do this is to copy the existing DepartmentListBS2PS transformation, since the input and output elements are the same. So, we select the DepartmentListBS2PS.qry transformation and choose File -> Save As… from the menu to create a new file named DepartmentListBS2PSExpanded.qry. In the visual mapper, we can then map the remaining department attributes and child employees, as shown below:

ExpandedMapping

In the HR pipeline we need to conditionally use either the  DepartmentListBS2PS.qry transformation or the DepartmentListBS2PSExpanded.qry transformation depending on the value of the expandDetails query parameter. To do this, we first store the value of this query parameter in a local variable using an Assign operation. We drag and drop the Assign operation from the component palette onto the circle just below the Request Action of the getDepartmentList branch. In the expression builder dialog, we select the expandDetails query parameter value:

ExpDetailsExpr

Note how the expression is enclosed within the data() expression to get the actual value of the XML parameter node instead of the XML node. In the Assign – Properties tab, we set the variable name to expandDetails.

ExpDetailsProps

To use the correct transformation based on the query parameter value, we drag and drop the If Then operation onto the circle just below the Response Action of the getDepartmentList branch. Using drag and drop, we then move the existing Replace action, which uses the simple DepartmentListBS2PS.qry transformation, to the else branch. The pipeline diagram now looks like this:

ITE1

The next step is to set the Condition expression on the If – Properties tab to ‘true’=$expandDetails.

ITE2

The dollar sign is used to reference the expandDetails variable that we set using the Assign operation that we defined before.

Finally, we drag and drop a new Replace action underneath the if branch and set the Value property to use the DepartmentListBS2PSExpanded.qry transformation.

ITE3

That's it. If we redeploy and test the resource in Postman with the expandDetails query parameter set to true, the result looks like this:

Postman2

Implementing the Create and Update Department Resources

Implementing the Create (POST) and Update (PUT) resources is straightforward and quite similar to the implementation of the GET resources explained in part 2. The only difference is that this time we also need to specify an XML schema element that represents the structure of the request payload, not just the response payload. We could reuse the DepartmentDetailsResponse element for this, but that would be a somewhat confusing name, so we add another element, DepartmentRequest, to the HRRestProxy.xsd that we will use for the POST and PUT resources:

<xsd:element name="DepartmentRequest" type="DepartmentType"/>

We can now create a new createDepartment REST binding that uses the same /departments resource, with the HTTP Verb set to POST and the request Element name set to DepartmentRequest.

CreateDepBinding

When creating or updating REST resources it is good practice to return the full resource, allowing the consumer to directly access any attribute values that might have been set or updated server-side. So, on the Response tab, we select the DepartmentDetailsResponse element.  We repeat these steps to create an updateDepartment binding, using the same resource and same XML elements, only the HTTP Verb is set to PUT. The Edit REST Bindings dialog now looks like this:

AllRestBindings

In the HR pipeline diagram, we add two more branches for the createDepartment and updateDepartment REST operations, and add a Route Node and a Routing for each operation, wiring up both Routing elements to the mergeDepartments operation in the ADF BC SOAP Service. We could have used the separate createDepartments and updateDepartments operations from the SOAP service, but then the request and response elements in the SOAP service would have been different, preventing re-use of the XQuery transformations that we are going to create next.

PLCreateUpdate

We need two more XQuery transformations: one that transforms the incoming proxy service payload holding the new or updated department element, and one that returns the new or updated department element from the business service. Since we can reuse the incoming and outgoing transformations for both operations, we name them MergeDepartmentPS2BS.qry and MergeDepartmentBS2PS.qry. The “PS2BS” version uses the DepartmentRequest element from HRRestProxy.xsd as the input parameter type and the mergeDepartments element from the corresponding business service XSD as the target element.

MergeXqry

In the visual mapper of the XQuery file we set up the SOAP mergeDepartments request body:

MergDepMapper

Likewise, we create the MergeDepartmentBS2PS.qry transformation by selecting the business service mergeDepartmentsResponse element as input and the DepartmentDetailsResponse element from the HRRestProxy schema as the target element.

With the transformations in place, we are ready for the last step which is to add the Replace action in the Request Action and Response Action of both operation branches. This should feel familiar by now so we omit the detailed steps to configure these actions with the correct transformations and input parameters.

We can use the Postman REST client again to conveniently test our /departments POST resource to create a new department:

Postman3

You might wonder what happens when you try to create or update a department with an invalid payload, for example a non-existing location id. Well, by default you will get a nasty internal server error as shown below.

Postman4

In the next part of this article series we will discuss how you can handle such exceptions as we will look into troubleshooting and exception handling techniques in a broader sense.

Downloads:

Creating a Mobile-Optimized REST API Using Oracle Service Bus – Part 4


Introduction

To build functional and performant mobile apps, the back-end data services need to be optimized for mobile consumption. RESTful web services using JSON as payload format are widely considered as the best architectural choice for integration between mobile apps and back-end systems. At the same time, most existing enterprise back-end systems provide a SOAP-based web service application programming interface (API) or proprietary file-based interfaces. In this article series we will discuss how Oracle Service Bus (OSB) 12c can be used to transform these enterprise system interfaces into a mobile-optimized REST-JSON API. This architecture layer is sometimes referred to as Mobile Oriented Architecture (MOA) or Mobile Service Oriented Architecture (MOSOA). A-Team has been working on a number of projects with OSB 12c to build this architecture layer. We will explain step-by-step how to build this layer, and we will share tips, lessons learned and best practices we discovered along the way.

Main Article

In part 1 we discussed the design of the REST API, in part 2 and part 3 we discussed the implementation of the RESTful services in service bus by transforming ADF BC SDO SOAP service methods. In this fourth part, we will take a look at techniques for logging, debugging, troubleshooting and exception handling.

Using Service Bus Logging

The easiest way to get more insight into what actually happens inside your pipelines is to add Log actions. You can simply drag a Log action from the component palette and drop it anywhere you want. For example, if a call to a business service fails, you can add log statements that print out the request body before and after the transformation performed by the Replace action, so you can inspect the payload.

LogActions

By default, the Severity of the log message is set to Debug. In order to see debug log messages, you need to change the OSB log level, which is set to Warning by default. You can do this using the Actions dropdown menu in the JDeveloper log window and choosing the Configure Oracle Diagnostic Logging option. You can also use Enterprise Manager by opening the service bus dropdown menu and choosing Logs -> Log Configuration.

SBLogMenu

If you set the log level to Trace (FINE) or lower, your debug log messages will appear in the JDeveloper log window. However, with this log level you also get a lot of standard diagnostic OSB log messages, which makes it harder to find your own messages. So, it is easiest to set the Log action Severity to Info, and the OSB log level to Notification (INFO). Note that if you change the log level, you do not need to restart WebLogic or redeploy your application; the changes are applied immediately.

LogLevelInfo

With info-level logging you have a clean log window that only contains your own log messages, and when you move your OSB application to production, you will not clutter the log files as long as the production log level is set to Warning or Error. Here is an example of the log window when we execute the /departments/{departmentId} resource (which maps to the getDepartmentDetails operation binding):

LogWindow

More information about service bus logging can be found here.

Running in Debug Mode

Another way to troubleshoot issues is to run your OSB application in debug mode. You can do this by choosing the Debug option from the proxy service popup menu:

RunDebugMode

You can set breakpoints on the actions in your pipeline diagram using the popup menu:

SetBreakpoint

When you then execute a resource, the debugger will stop at your breakpoint and you can use the “data” debug window to inspect the flow of data through your pipeline.

DebugData

You can expand the various XML elements to see the contents of the header and body of your request and response. In the above screenshot we have expanded the body element, which shows the same data as we logged in the previous section. Any custom variables that you use to store temporary data, like the expandDetails variable we introduced in part 3, are also visible. When the debugger hits a breakpoint you have the normal debugging options like Step Over to go to the next action in the pipeline, or Resume to go to the next breakpoint. In other words, running in debug mode allows you to determine the execution path through your pipeline in addition to viewing the data, like you can with log messages.

Handling Business Service Exceptions

Invoking business services might cause various (unexpected) exceptions. The business service call might fail because the server is down, or the call succeeds but leads to an error because some business rule is violated while performing some update action. Without error handling added to our service bus application, any exception will cause a response with HTTP code 500 Internal Server Error and a meaningless OSB error code and message.

It is good practice to use appropriate HTTP error codes depending on the type of exception that occurs. When the client sends a JSON payload that contains invalid data, for example a non-existent manager ID, it is common practice to return HTTP error code 400 “Bad Request”. When the business service does not respond at all, we return HTTP error code 404 “Not Found”. Using these error codes makes it clear to the consumer whether it is dealing with an application error (400) or a server error (404).

To return appropriate HTTP error codes, we first need to define an XSD that contains the structure of the error message for each type of error. Here is the error.xsd that we will use in our example:

<?xml version = '1.0' encoding = 'UTF-8'?>
<xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema"
            xmlns="http://oracle.com/mobile/errors"
            targetNamespace="http://oracle.com/mobile/errors"
            elementFormDefault="qualified">
<xsd:element name="ApplicationError">
  <xsd:complexType>
    <xsd:sequence>
      <xsd:element name="code" type="xsd:string"/>
      <xsd:element name="message" type="xsd:string"/>
      <xsd:element name="severity" type="xsd:string"/>
    </xsd:sequence>
  </xsd:complexType>
</xsd:element>

<xsd:element name="ServerError">
  <xsd:complexType>
    <xsd:sequence>
      <xsd:element name="message" type="xsd:string"/>
    </xsd:sequence>
  </xsd:complexType>
</xsd:element>

</xsd:schema>

With this XSD in place we can add two fault bindings to our REST operation bindings:

FaultBindings

Each fault binding needs to have its own unique XSD element type. If we would reuse the ApplicationError element type with the 404 fault binding, the 404 error code would never be returned. OSB determines which fault binding to use based on the element type used in the fault response returned by the pipeline.

To be able to return a different payload and associated HTTP error code in case of an exception, we need to add a so-called error handler to our route nodes. We right-mouse-click on the RouteNode of the createDepartment operation branch, and choose Add Error Handler. To figure out the kind of response we get when violating a business rule, we first drag and drop a Log action inside the error handler and set the expression to $body.

CDError1

We now execute the /departments POST resource with an invalid managerId in the payload:

InvalidMgrId

In the log window we can inspect the payload body returned in case of an ADF BC exception being thrown:

InvalidMgrError

The body contains a generic part with the <env:Fault> element, and inside the <detail> element we can find the ADF-specific error. We need an XQuery file to transform the ADF error message to the ApplicationError element. As source element type we choose the ServiceErrorMessage element:

ErrorInput

As target element we choose the ApplicationError element from the error.xsd and then we can drag the mapping lines as shown below

ErrorMapping

As we have seen in the JDeveloper log window, the message element contains both the error code and the error message. Since we have a separate element for the code, we want to strip the error code from the message. We can do this using the XQuery string function substring-after: in the component palette, we change the value of the dropdown list to XQuery Functions and expand the String Functions accordion. We drag and drop the substring-after function onto the message mapping line, inside the Mappings area in the middle. We click the yellow function expression icon that appears, and then we can complete the expression in the Expression – Properties window.

ErrorMapping2

We should replace the second argument of the function with ': ' because the actual error message appears after these two characters. Click the XQuery Source tab to make sure that the expression has been saved correctly; sometimes the change you make in the properties window is not picked up. If this is the case, just re-enter the function argument in the source.

In the component palette there are many XQuery functions available. If you want to get more information on how to use them, you can use the xqueryfunctions.com website.
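As a quick illustration of the substring-after function we just used (the input string below is made up for the example):

substring-after('JBO-12345: Department location does not exist', ': ')
(: returns 'Department location does not exist' :)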

To complete the XQuery transformation, we need to surround the ApplicationError element with the standard SOAP fault element. If we forget this step, the payload will not be recognized as a valid fault payload and the fault bindings we defined for the REST operation will not be used. We cannot do this in a visual way, so we click the XQuery Source tab and add the surrounding fault element as shown below:

ErrorMapping3

Note that the value of the <faultcode> element must be set to env:Server, otherwise it will not work. The value of the <faultstring> element doesn’t matter. With the XQuery transformation in place we can add a Replace action inside the error handler which uses this transformation:

ErrorReplace1

The expression for the ServiceErrorMessage input variable is shown below. Note the double slashes, which mean the expression searches the whole tree inside the body element, not just its direct children.

ErrorReplace

The err namespace can be found in the JDeveloper log and should be set to http://xmlns.oracle.com/adf/svc/errors/.

The last step is to drag and drop a Reply action after the Replace action and set the option With Failure to inform the proxy service that a fault response is returned.

ReplyProps

That's it. If we now use Postman to create a new department with an invalid managerId, we get a nice error response with HTTP code 400:

PostmanErr

To handle the situation where the ADF BC SOAP server is down, we need to return a response which contains the ServerError element so we can return the HTTP error code 404 together with a user-friendly error message. To distinguish between the 400 and 404 error response, we drag and drop an If-Then action inside the error handler, and enter the following expression in the Condition field:

$body//err:ServiceErrorMessage != ''

When this expression is true we are dealing with an ADF BC exception, so we move the Replace action we already defined inside the If branch. In the Else branch we should return a generic error stating that the service is not available. Since there is nothing to transform, we can enter the required response payload directly in the expression field:

 <env:Fault xmlns:env="http://schemas.xmlsoap.org/soap/envelope/">
   <faultcode>env:Server</faultcode>
   <faultstring>Generic Error</faultstring>
   <detail>
     <ns2:ServerError>
       <ns2:message>The HR service is currently not available, please contact the helpdesk</ns2:message>
     </ns2:ServerError>
   </detail>
 </env:Fault>

The complete error handler (with log actions removed) now looks like this:

ErrorHandlerComplete

When we bring the ADF BC Server down and use Postman again to submit a new department, we will get the 404 error code together with the generic error we just defined in the body replace expression:

PostmanServerError

To finish the exception handling, we need to add the same error handler to the updateDepartment operation. A quick way to do this is to right-mouse-click on the createDepartment error handler and choose Copy from the popup menu. Then right-mouse-click on the updateDepartment RouteNode and choose Paste from the popup menu. However, a better and more reusable way to do this is to create a pipeline template and define the error handler in the template. This prevents duplication of identical error handlers and it allows us to change the error handler over time in the template, with the changes being picked up automatically by all pipelines based on this template. We will look into pipeline templates in more depth later on in this article series.

Downloads:

Calling Web Services in Background Using MAF 2.1


Introduction

Responsiveness of mobile applications is absolutely critical for user acceptance. This survey shows that the most important reason for users to discard an app after first use is the performance. When building enterprise mobile apps, the web service calls made to back-end services are often a bottleneck in achieving acceptable levels of performance. Moving these calls to a background thread, in combination with on-device data caching using the SQLite database is often an efficient and effective way of addressing these performance challenges. This article explains how you can easily execute more expensive tasks like these web service calls in the background using Oracle MAF 2.1 and at the same time provide the user with a visual indicator that a data operation is being executed in the background.

Main Article

When starting to implement web service calls and associated processing logic in the background, you need to realize that the user can continue to use the application, tap and navigate around, while triggering additional data actions, and as a result fire multiple concurrent background tasks for potentially more than one feature. Is that a problem? Yes, most likely it will cause problems if you have multiple background threads processing data simultaneously:

  • The sequence in which the background tasks are triggered might be important, for example when the result of one REST or SOAP call needs to be used, or is assumed to be processed before the next web service call takes place. With multiple concurrent background threads, this sequence cannot be guaranteed.
  • The on-device SQLite database is a single user database. It can cause errors or unexpected results when multiple background threads are writing data to the same SQLite database.

So, to guarantee correct sequencing of web service calls and to avoid concurrency issues with the SQLite database, it is best to execute all background tasks in one and the same thread using a queueing mechanism. You might be puzzled about how to build such functionality, but the good news is that this is extremely simple in MAF 2.1 because we can now leverage Java JDK 1.8.

Let’s start with the simplest imaginable version of our BackgroundTaskExecutor class:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class BackgroundTaskExecutor {

    private static BackgroundTaskExecutor instance;
    ExecutorService executor = Executors.newSingleThreadExecutor();

    public BackgroundTaskExecutor() {
        super();
    }

    public static synchronized BackgroundTaskExecutor getInstance() {
        if (instance == null) {
            instance = new BackgroundTaskExecutor();
        }
        return instance;
    }

    public void execute(Runnable task) {
        executor.submit(task);
    }
}

In this class we are leveraging some classes from the java.util.concurrent package that is available in the Compact2 profile of JDK 1.8 used by MAF 2.1. Most notable is the Executors class, which contains a convenience method that creates a single-thread pool with a built-in queueing mechanism, exactly what we need!
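As a small standalone illustration (outside MAF, purely to show the queueing behavior), tasks submitted to a single-thread executor are queued and executed one at a time, in submission order:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class SingleThreadDemo {
    public static void main(String[] args) {
        ExecutorService executor = Executors.newSingleThreadExecutor();
        // Both tasks end up in the executor's internal queue and run sequentially,
        // so "first" is always printed before "second".
        executor.submit(() -> System.out.println("first"));
        executor.submit(() -> System.out.println("second"));
        executor.shutdown();
    }
}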

To use this class, we need to call the execute method, passing in an instance of Runnable. Using the new Lambda expressions of JDK 1.8 it is easy to create this Runnable method argument. Here is a code snippet from the demo application that comes with this post that loads some departments in the background:

import java.util.ArrayList;
import java.util.List;

import oracle.adfmf.framework.api.AdfmfJavaUtilities;
import oracle.adfmf.java.beans.ProviderChangeSupport;

public class DepartmentService {

    List departments = new ArrayList();
    private transient ProviderChangeSupport providerChangeSupport = new ProviderChangeSupport(this);

    public List getDepartments() {
        return departments;
    }

    public void loadDepartments() {
        BackgroundTaskExecutor.getInstance().execute(
         () -> { 
           // mimic slow SOAP or REST call by waiting for 3 seconds
            try {
                Thread.sleep(3000);
            } catch (InterruptedException e) {
            } 
             departments.add(new Department("1","Marketing"));
             departments.add(new Department("2","Sales"));
             departments.add(new Department("3","Support"));
             providerChangeSupport.fireProviderRefresh("departments");
             AdfmfJavaUtilities.flushDataChangeEvent();
         });
    }
}

To further enhance the user experience, it would be nice to provide a visual indicator that tells the user that some data operation is taking place in the background. To implement this, we need to enhance our BackgroundTaskExecutor class and check whether the last task is completed. Here is the code we need to add to get a boolean EL expression that returns true when there is a task running in the background:

    private boolean running = false;
    private Future<?> lastFuture = null;

    protected void setRunning(boolean running) {
        // Keep the instance field in sync so updateStatus() only spawns one watcher thread
        this.running = running;
        AdfmfJavaUtilities.setELValue("#{applicationScope.bgtask_running}", running);
        AdfmfJavaUtilities.flushDataChangeEvent();
    }

    protected void updateStatus() {
        if (!running) {
            Thread t = new Thread(
            () -> {
                if (!lastFuture.isDone()) {
                    setRunning(true);
                }
                while (!lastFuture.isDone()) {
                    try {
                        // Check every 0.1 second if the task is completed
                        Thread.sleep(100);
                    } catch (InterruptedException e) {
                    }
                }
                setRunning(false);
                AdfmfJavaUtilities.flushDataChangeEvent();
            });
            t.start();
        }
    }

    public synchronized void execute(Runnable task) {
        lastFuture = executor.submit(task);
        updateStatus();
    }

We have defined two additional member variables: one to keep track of the current running state, and one that holds the so-called Future of the last submitted task. A Future is an object that is returned when submitting a task and can be used to track the progress of the task by calling its isDone() method. So, to tie it together, we update the lastFuture variable when submitting a new task, and we call the updateStatus method to ensure the UI gets notified of a running background task. The updateStatus method spawns a separate thread that runs until the task is completed. Every 0.1 second, the status of the last submitted task is checked, and if it is completed the running variable of the class is set to false, as is the applicationScope variable “bgtask_running”. If a task is submitted while another one is still running, the updateStatus method does nothing; the existing background thread will automatically check this last submitted task since we updated the lastFuture variable.

With this code in place, we can now easily show some spinning icon in the user interface:

<amx:image id="i1" source="/images/reloading.gif" rendered="#{applicationScope.bgtask_running}"/>

You can download the sample application that comes with this blog post and test the functionality yourself.

Demo

The demo consists of a page with a list of departments. The list can be incrementally loaded using two buttons. As long as at least one of the two loading tasks is running, a spinning refresh icon is shown in the upper-right corner.

This article only discussed the overall technique of running tasks in the background. If you are looking for a more comprehensive solution that includes code to process REST or SOAP calls and cache results in a local SQLite database, you should check out the A-Team Mobile Persistence Accelerator. This accelerator comes as a free JDeveloper extension and contains advanced wizards and generic runtime code that allow you to create a persistence layer supporting offline reads and writes, with data and web service operations (optionally) executed in the background. All this functionality comes out of the box and requires no Java coding on your side.

Integrating with Documents Cloud using a Text Editor as an example


Introduction

Oracle Documents Cloud Service provides a powerful REST API and embed tools that allow integration with almost anything today. This section covers a web text editor that reads HTML/text content from, and sends it to, Documents Cloud using basic JavaScript and jQuery. The first example covers the basic actions of loading a text area and sending its content as a text file to the server. For the second example, the web editor is CKEditor, and the following steps are covered:

  •  Basic Authentication
  •  Download (GET) an existing file to the editor from the Documents Cloud
  •  Change the text and save (POST) to the Documents Cloud as a new revision

This post is the first of a series of different uses of the API for document creation and manipulation. Other features such as Document Picker will be covered in the future.

Main Article

To simplify this example, I have removed complex scripts and functions and I am using basic authentication. The web editor component, CKEditor, is also used with minimal plugins, and only basic features are covered; you can extend this later in a more complex solution.

What do you need?

  •   Access to an Oracle Documents Cloud instance (https://cloud.oracle.com/documents)
  •   A Web Server to host your custom HTML file
  •   Optionally download the 3rd party editor CKEditor (Any edition) from http://ckeditor.com
    (The examples are using the CDN version)
  •   Follow the steps and have fun

Preparing the environment

Each example is a single HTML file that you can host on a web server.

For this example, we will create three HTML files: hellodocs.html to test the environment, simpletexteditor.html with just a textarea, and texteditor.html with the 3rd party CKEditor to create rich text and send it to the Documents Cloud.

Testing the environment

Test access to the Documents Cloud UI by entering your username, password and identity domain. The address will look like this: https://<DoCS_server>/documents. Optionally, you can also enter the address https://<DoCS_server>/documents/api to see the response from the server.

To make sure that your environment is ready, use this code to see if you can run JQuery and you have connectivity to your Oracle Documents Cloud instance:

        $(document).ready(function(){
          $("#hellodocs").click(function(){
              var docsUrl = DoCSinstance.value + '/documents/api/1.1';
              $("#test1").text('Loading!');
              $.ajax ( {
                  type: 'GET',
                  url: docsUrl,
                  dataType: 'text',
                  beforeSend: function (xhr) {
                      xhr.setRequestHeader ('Authorization',
                        'Basic ' + btoa(DoCSuser.value + ':' + DoCSpassword.value));
                  },
                  success: function(data) {
                      $("#test1").text(data);
                      $("#editor1").val(data);
                      $("#status").text('Success');
                  },
                  error: function(jqXHR, textStatus, errorThrown) {
                      $("#status").text('ErrorMessage: '+ jqXHR.responseText);
                      $("#test1").text('Error: '+ textStatus, errorThrown);
                  }
              } );
          });
        });

 

For the jQuery REST calls you need to have the correct cross-origin (CORS) setup, or else you will experience 401 error messages.

Now include the following code to test the CKEditor:

    <script src="//cdn.ckeditor.com/4.4.7/full/ckeditor.js"></script>
    <script src="//cdn.ckeditor.com/4.4.7/full/adapters/jquery.js"></script>
    <script>
      $( document ).ready( function() {
          $("#editor1").ckeditor();
      } );
    </script>

 

Full Hello World code here:

<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1">
    <title>Hello DoCS</title>
    <meta name="description" content="Hello Documents Cloud - by A-Team">
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.3/jquery.min.js"></script>
    <script src="//cdn.ckeditor.com/4.4.7/full/ckeditor.js"></script>
    <script src="//cdn.ckeditor.com/4.4.7/full/adapters/jquery.js"></script>
    <script>
      $( document ).ready( function() {
          $("#editor1").ckeditor();
      } );
    </script>
  </head>
  <body>
    <script>
        $(document).ready(function(){
          $("#hellodocs").click(function(){
              var docsUrl = DoCSinstance.value + '/documents/api/1.1';
              $("#test1").text('Loading!');
              $.ajax ( {
                  type: 'GET',
                  url: docsUrl,
                  dataType: 'text',
                  beforeSend: function (xhr) {
                      xhr.setRequestHeader ('Authorization',
                        'Basic ' + btoa(DoCSuser.value + ':' + DoCSpassword.value));
                  },
                  success: function(data) {
                      $("#test1").text(data);
                      $("#editor1").val(data);
                      $("#status").text('Success');
                  },
                  error: function(jqXHR, textStatus, errorThrown) {
                      $("#status").text('ErrorMessage: '+ jqXHR.responseText);
                      $("#test1").text('Error: '+ textStatus, errorThrown);
                  }
              } );
          });
        });
      </script>
      <h2>Hello Documents Cloud</h2>
      <p>
      Username: <input id="DoCSuser" type="text" value="tenant.user">
      Password: <input id="DoCSpassword" type="password" value="userpassword">
      </p>
      DoCS Instance: <input id="DoCSinstance" type="text" size="36" value="https://tenant.documents.us2.oraclecloud.com">
      <button id="hellodocs">DoCS Test</button>
      <p id="test1">.</p>
      <p id="status">Enter your username/password and Documents Cloud Instance to test</p>
      <textarea cols="80" id="editor1" name="editor1" rows="10">
        Some Text
      </textarea>
  </body>
</html>

 

With this simple code, your page will look like this:

Hello Documents Cloud Screenshot 

Cloud Text Editor

Only two REST calls are used in the example: Download File (/documents/api/1.1/files/{{file id}}/data) [GET] and Upload File Version (/documents/api/1.1/files/{{file id}}/data) [POST].

Here you can find the code example of the cloud text editor integrated with the Documents Cloud:

[Example 1: simpletexteditor.html]

 

<!DOCTYPE html>
<html>
    <head>
        <meta charset="utf-8">
        <meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1">
        <title>Simple Documents Cloud Text Editor Example(Not really editor)</title>
        <meta name="description" content="Simple Documents Cloud Text Editor (Not really editor)">
        <script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.3/jquery.min.js"></script>      
    </head>
    <body>    
        <script>
            $(document).ready(function(){
                $("#btnLoadDoc").click(function(){
                    var docsUrl = DoCSinstance.value + '/documents/api/1.1';
                    var strFileId = metadataFileId.value;
                    $("#status").text('Loading!');
                    $.ajax ( {
                        type: 'GET',
                        url: docsUrl + '/files/' + strFileId + '/data',
                        crossDomain: true,
                        xhrFields: { withCredentials: true },                        
                        beforeSend: function (xhr) { 
                            xhr.setRequestHeader ('Authorization', 
                                                  'Basic ' + btoa(DoCSuser.value + ':' + DoCSpassword.value)); 
                        },
                        success: function(data) { 
                            $("#editor1").text(data);
                            $("#status").text('Document loaded');
                            $("#metadataInfo").text('');
                        },
                        error: function(jqXHR, textStatus, errorThrown) {
                            $("#status").text('ErrorMessage: '+ jqXHR.responseText);
                            $("#metadataInfo").text('Error: '+ textStatus, errorThrown);
                            
                        }
                    } ); 
                });
                $("#btnSaveDoc").click(function(){
                    var docsUrl = DoCSinstance.value + '/documents/api/1.1';
                    var strFileId = (metadataFileId.value == '' ? '' : '/' + metadataFileId.value);
                    var strFileName = metadataFilename.value;
                    $("#status").text("Saving!");
                    var fileContent = new Blob([editor1.value], { type: 'text/plain'});
                    var filePackage = new FormData();
                    filePackage.append('jsonInputParameters','{"parentID": "self"}');
                    filePackage.append('primaryFile',fileContent, strFileName);
                    $.ajax ( {
                        type: 'POST',
                        url: docsUrl + '/files' + strFileId + '/data',
                        enctype: 'multipart/form-data',
                        data: filePackage,
                        cache: false,
                        processData: false,
                        contentType: false,
                        crossDomain: true,
                        xhrFields: { withCredentials: true },
                        beforeSend: function (xhr) { 
                            xhr.setRequestHeader ('Authorization', 
                                                  'Basic ' + btoa(DoCSuser.value + ':' + DoCSpassword.value));  
                        },
                        success: function(data) { 
                            $("#status").text('Document Saved');
                            $.each(data, function(key, value) { 
                                if(key == "version"){
                                    $("#metadataVersion").text('Version: ' + value);
                                }
                                if(key == "id"){
                                    $("#metadataFileId").val(value);
                                }
                                $("#metadataInfo").append(key + ': ' + value + '<br>');
                            });
                        },
                        error: function(jqXHR, textStatus, errorThrown) {
                            $("#status").text('ErrorMessage: '+ jqXHR.responseText);
                            $("#metadataInfo").text('Error: '+ textStatus, errorThrown);
                        }
                    } ); 
                });
            });                     
        </script>
        <h2>Simple Oracle Documents Cloud Text Editor sample</h2>
        <p>
        Username: <input id="DoCSuser" type="text" value="tenant.user">
        Password: <input id="DoCSpassword" type="password" value="userpassword">
        </p>
        Documents Cloud Address: <input id="DoCSinstance" type="text" size="50" value="https://tenant.documents.us2.oraclecloud.com">
        <p>
        File Name: <input id="metadataFilename" type="text" size="10" value="">
        File Id: <input id="metadataFileId" type="text" size="53" value="">
        <span id="metadataVersion" style="color:blue">--</span>
        </p>
        <p></p>
        <button id="btnLoadDoc">Load Text</button>
        <button id="btnSaveDoc">Save Text</button>
        <br>
        <textarea cols="80" id="editor1" name="editor1" rows="10">
            My First Documents Cloud text document
        </textarea>
        <p id="status">Enter your username/password and Documents Cloud Instance to test</p>
        <p id="metadataInfo"></p>
        <script>
            $("#metadataFilename").val('mytext' + (Math.floor((Math.random() * 1000) + 1)) + '.txt');
            $("#btnLoadDoc").prop('disabled', true);
            $('#metadataFileId').on('input', function() {
                $("#btnLoadDoc").prop('disabled', false);
            });
        </script>
    </body>
</html>

 


Now including the 3rd party CKEditor:

[Example 2: texteditor.html]

 

<!DOCTYPE html>
<html>
    <head>
        <meta charset="utf-8">
        <meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1">
        <title>Custom Documents Cloud Text Editor Sample - by A-Team</title>
        <meta name="description" content="Custom Documents Cloud Text Editor - by A-Team">
        <script src="https://ajax.googleapis.com/ajax/libs/jquery/2.1.3/jquery.min.js"></script>
        <script src="//cdn.ckeditor.com/4.4.7/full/ckeditor.js"></script>
        <script src="//cdn.ckeditor.com/4.4.7/full/adapters/jquery.js"></script>
        <script>
          $( document ).ready( function() {
              $("#editor1").ckeditor();
          } );
        </script>
    </head>
    <body>    
        <script>
            $(document).ready(function(){
                $("#btnLoadDoc").click(function(){
                    var docsUrl = DoCSinstance.value + '/documents/api/1.1';
                    var strFileId = metadataFileId.value;
                    $("#status").text('Loading!');
                    $.ajax ( {
                        type: 'GET',
                        url: docsUrl + '/files/' + strFileId + '/data',
                        crossDomain: true,
                        xhrFields: { withCredentials: true },                        
                        beforeSend: function (xhr) { 
                            xhr.setRequestHeader ('Authorization', 
                                                  'Basic ' + btoa(DoCSuser.value + ':' + DoCSpassword.value)); 
                        },
                        success: function(data) { 
                            $("#editor1").val(data);
                            $("#status").text('Document loaded');
                            $("#metadataInfo").text('');
                        },
                        error: function(jqXHR, textStatus, errorThrown) {
                            $("#status").text('ErrorMessage: '+ jqXHR.responseText);
                            $("#metadataInfo").text('Error: '+ textStatus, errorThrown);
                            
                        }
                    } ); 
                });
                $("#btnSaveDoc").click(function(){
                    var docsUrl = DoCSinstance.value + '/documents/api/1.1';
                    var strFileId = (metadataFileId.value == '' ? '' : '/' + metadataFileId.value);
                    var strFileName = metadataFilename.value;
                    $("#status").text("Saving!");
                    var fileContent = new Blob([$("#editor1").val()], { type: 'text/plain'});
                    var filePackage = new FormData();
                    filePackage.append('jsonInputParameters','{"parentID": "self"}');
                    filePackage.append('primaryFile',fileContent, strFileName);
                    $.ajax ( {
                        type: 'POST',
                        url: docsUrl + '/files' + strFileId + '/data',
                        enctype: 'multipart/form-data',
                        data: filePackage,
                        cache: false,
                        processData: false,
                        contentType: false,
                        crossDomain: true,
                        xhrFields: { withCredentials: true },
                        beforeSend: function (xhr) { 
                            xhr.setRequestHeader ('Authorization', 
                                                  'Basic ' + btoa(DoCSuser.value + ':' + DoCSpassword.value));  
                        },
                        success: function(data) { 
                            $("#status").text('Document Saved');
                            $.each(data, function(key, value) { 
                                if(key == "version"){
                                    $("#metadataVersion").text('Version: ' + value);
                                }
                                if(key == "id"){
                                    $("#metadataFileId").val(value);
                                }
                                $("#metadataInfo").append(key + ': ' + value + '<br>');
                            });
                            $("#btnLoadDoc").prop('disabled', false);
                        },
                        error: function(jqXHR, textStatus, errorThrown) {
                            $("#status").text('ErrorMessage: '+ jqXHR.responseText);
                            $("#status").text('Login Error: '+ textStatus, errorThrown);
                        }
                    } ); 
                });
            });                     
        </script>
        <h2>Oracle Documents Cloud Text Editor Sample</h2>
        <p>
        Username: <input id="DoCSuser" type="text" value="tenant.user">
        Password: <input id="DoCSpassword" type="password" value="userpassword">
        </p>
        Documents Cloud Address: <input id="DoCSinstance" type="text" size="50" value="https://tenant.us2.oraclecloud.com">
        <p>
        File Name: <input id="metadataFilename" type="text" size="10" value="">
        File Id: <input id="metadataFileId" type="text" size="53" value="">
        <span id="metadataVersion" style="color:blue">--</span>
        </p>
        <p></p>
        <button id="btnLoadDoc">Load Text</button>
        <button id="btnSaveDoc">Save Text</button>
        <br>
        <textarea cols="80" id="editor1" name="editor1" rows="10">
            My First <b>Documents Cloud</b> text document
        </textarea>
        <p id="status">Enter your username/password and Documents Cloud Instance to test</p>
        <p id="metadataInfo"></p>
        <script>
            $("#metadataFilename").val('mytext' + (Math.floor((Math.random() * 1000) + 1)) + '.html');
            $("#btnLoadDoc").prop('disabled', true);
            $('#metadataFileId').on('input', function() {
                $("#btnLoadDoc").prop('disabled', false);
            });
        </script>
    </body>
</html>

 

Your page will look like this:

Documents Cloud Text Editor Sample Screenshot

 

Simple and powerful.

 

View of the Documents Cloud UI and the articles stored with the example:

Documents Cloud UI

 

You can also use it with inline editing and multiple editable regions:

Inline Editing example

 

More complex code can use multiple instances of the editor, each bound to a different Documents Cloud document, in the same page.

 

 

Conclusion

This article covers the basics of using the Documents Cloud REST API, with a simple example of what you can do with Oracle Documents Cloud Service. From here you can expand to more features for a complete solution that fits your needs, such as browsing an image in the repository and including it in the text, creating new files, sharing your documents with others, and much more.

 

Reference

Oracle Documents Cloud Service info: http://cloud.oracle.com/documents

DoCS REST API: http://docs.oracle.com/cloud/latest/documentcs_welcome/WCCCD/odcs-restapi.htm

(3rd Party) CKEditor Documentation: http://docs.ckeditor.com

Node.js – Invoking Secured REST Services in Fusion Cloud – Part 1


Introduction

This post focuses on invoking secured Fusion Cloud RESTFul services using Node.js. Part 1 is explicitly focused on the “GET” method. The assumption is that the reader has some basic knowledge on Node.js. Please refer to this link to download and install Node.js in your environment.

Node.js is a programming platform that allows you to execute server-side code that is similar to JavaScript in the browser. It enables real-time, two-way connections in web applications with push capability, allowing a non-blocking, event-driven I/O paradigm. It runs on a single-threaded event loop and leverages asynchronous calls for various operations such as I/O. This is an evolution from the stateless web based on the stateless request-response paradigm. For example, when a request is sent to invoke a service such as REST or a database query, Node.js will continue serving new requests; when a response comes back, it jumps back to the respective requestor. Node.js is lightweight and provides a high level of concurrency. However, it is not suitable for CPU-intensive operations as it is single threaded.

Node.js is built on an event-driven, asynchronous model. The in-coming requests are non-blocking. Each request is passed off to an asynchronous callback handler. This frees up the main thread to respond to more requests.

 

Main Article

The internet media type for RESTFul services is often JavaScript Object Notation (JSON). JSON is a lightweight data-interchange format and a standard way to exchange data with RESTFul services. It is not only human readable, but also easy for machines to parse and generate. For more information on JSON, please refer to this link.

JSON Samples:

Simple Data Structure

var employee = {
    "name" : "Joe Smith",
    "ID" : "1234",
    "email" : "joe.smith@oracle.com"
};

Data in Arrays

var emailGroups = [{
    "email" : "email1@myCompany.com",
    "name" : "Joe Smith"
},
{
    "email" : "email2@myCompany.com",
    "name" : "Don Smith"
}];

 

Security

The RESTFul services in Oracle Fusion Cloud are protected with Oracle Web Service Manager (OWSM). The server policy allows the following client authentication types:

  • HTTP Basic Authentication over Secure Socket Layer (SSL)
  • Oracle Access Manager (OAM) token service
  • Simple and Protected GSS-API Negotiate Mechanism (SPNEGO)
  • SAML token

The client must satisfy one of the above authentication types in the security headers of the invocation call. The sample in this post uses the HTTP Basic Authentication over SSL policy.

 

Node.js HTTP Get Request

In general there are two ways to invoke HTTP/S-secured REST services with the GET method:

  • http.get() – This is a native HTTP/S API and supports only the GET method.
  • http.request() – This is designed to make HTTP/S calls and supports multiple request methods such as GET, POST, PUT, DELETE, etc.

HTTP GET Module

This native API implicitly calls http.request with the method set to GET and calls request.end() automatically. There are two parameters for this method that define what REST service is being invoked and how. The following snippet demonstrates the construct for an HTTP/S invocation:

var client = require('https');
var request = client.get(options, function(response) { /* ... */ });

The above command uses the HTTPS protocol. The “require” statement loads either the HTTP or the HTTPS module.

Options

The “options” is an object or string that includes the following information:

  • host – a domain name or IP address of the server to issue the request to.
  • port – the port of the remote server.
  • path – the Uniform Resource Identifier (URI).
  • HTTP headers – headers that must be sent with the request, such as authorization.
  • Certificates – certificates such as the CA cert for SSL.

For more information on the “options” object, refer to the following link.

This is a typical example of constructing ‘options’:

var options = {
    ca: fs.readFileSync('myCert'),
    host: 'hostname.mycompany.com',
    port: 443,
    path: '/hcmCoreApi/atomservlet/employee/newhire',
    headers: {
        'Authorization': 'Basic ' + new Buffer(uname + ':' + pword).toString('base64')
    }
};

 

function(response)

This is a callback parameter that acts as a one-time listener for the response event, which is emitted when a response to this request is received.

To get the response, add a listener for ‘response’ to the request object. The ‘response’ will be emitted from the request object when the response headers have been received. The ‘response’ event is executed with one argument which is an instance of http.IncomingMessage.

During the ‘response’ event, one can add listeners to the response object; particularly to listen for the ‘data’ event.

If no ‘response’ handler is added, then the response will be entirely discarded. However, if you add a ‘response’ event handler, then you must consume the data from the response object, either by calling response.read() whenever there is a ‘readable’ event, or by adding a ‘data’ handler, or by calling the .resume() method. Until the data is consumed, the ‘end’ event will not fire. Also, until the data is read it will consume memory that can eventually lead to a ‘process out of memory’ error.

Note: Node does not check whether Content-Length and the length of the body which has been transmitted are equal or not.

The following sample implements three events:

  1. Data – an event to get the data.
  2. End – an event to know the response is completed.
  3. Error – an event to capture the error and trigger your error handling logic.

request = http.get(options, function(res) {
    var body = "";
    res.on('data', function(chunk) {
        body += chunk;
    });
    res.on('end', function() {
        console.log(body);
    });
    res.on('error', function(e) {
        console.log("Got error: " + e.message);
    });
});

 

HTTP Request Module

The request module is designed to make various HTTP/S calls such as GET, POST, PUT, DELETE, etc. The http.get() implicitly calls http.request with the method set to GET and calls request.end() automatically. The code for the GET method is identical to http.get() except for the following:

  • The http.get() is replaced with “http.request()”
  • The “option” object has a method property that defines the HTTP operation (default is GET)
  • The request.end() is explicitly called to signify the end of the request
Sample Snippet:
var options = {
    ca: fs.readFileSync('myCert'),
    host: 'hostname.mycompany.com',
    port: 443,
    method: 'GET',
    path: '/<fusion_apps_api>/employee',
    headers: {
        'Authorization': 'Basic ' + new Buffer(uname + ':' + pword).toString('base64')
    }
};
var request = http.request(options, function(res) { /* ... */ });
// ...
request.end();

 

Response Status Codes and Headers

The HTTP status code is available from the response event function, function(res). For example:

res.statusCode

 

The HTTP response headers are available from the same response event function as follows:

res.headers

 

Parsing JSON Response

The JSON response message can be parsed using the JSON object. JSON.parse() parses a string returned from the RESTFul service. For example:

JSON.parse(responseString)
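Putting these pieces together, a minimal sketch of an 'end' handler that checks the status code and parses a JSON response could look like this (the 'items' property is just an illustrative name, not something the Fusion API is guaranteed to return):

res.on('end', function() {
    if (res.statusCode === 200) {
        var result = JSON.parse(responseString);
        // 'items' is an illustrative property name; inspect the actual payload
        // returned by your Fusion REST service to know what to read here.
        console.log(result.items);
    } else {
        console.log('Request failed with status ' + res.statusCode);
    }
});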

 

Sample Code for http.get()

var uname = 'username';
var pword = 'password';
var http = require('https'),
    fs = require('fs');

var options = {
    ca: fs.readFileSync('MyCert'),
    host: 'host.mycompany.com',
    port: 10620,
    path: '/<fusion_apps_api>/employee',
    headers: {
        'Authorization': 'Basic ' + new Buffer(uname + ':' + pword).toString('base64')
    }
};

request = http.get(options, function(res) {
    var responseString = "";
    res.on('data', function(data) {
        responseString += data;
    });
    res.on('end', function() {
        console.log(responseString);
    });
    res.on('error', function(e) {
        console.log("Got error: " + e.message);
    });
});

Sample Code for http.request()

var uname = 'username';
var pword = 'password';
var http = require('https'),
    fs = require('fs');

var options = {
    ca: fs.readFileSync('hcmcert1'),
    host: 'host.mycompany.com',
    port: 10620,
    path: '/<fusion_apps_api>/employee',
    method: 'GET',
    headers: {
     'Authorization': 'Basic ' + new Buffer(uname + ':' + pword).toString('base64')
   }         
};

var    request = http.request(options, function(res){
    console.log(res.headers);
    var responseString = '';
    res.on('data', function(chunk) {
         console.log(chunk);
         responseString += chunk;
    });
    res.on('end', function() {
        console.log(responseString);
    })
    res.on('error', function(e) {
        console.log("Got error: " + e.message);
    });
});

request.end();

Conclusion

This post demonstrates how to invoke secured Fusion Cloud REST services using Node.js. It also provides a basic introduction to the JSON format and how to parse a JSON response in Node.js. The sample code is a prototype and should be further modularized for reusability.

 

Using Oracle Documents Cloud REST API with Python Requests


Python Requests is a library that simplifies consuming RESTful resources from the client side. The Oracle Documents Cloud Service (DOCS) REST API fits well with Python Requests, allowing a service call to get a folder or file to be done in a few lines of code. Since Python is naturally friendly to JSON responses, parsing the Response object can be done using standard syntax.

The Requests library must be installed into Python using the pip utility (or easy_install on Windows). The example below also uses the “getpass” module to handle user entry of a password.

Python Requests link: http://docs.python-requests.org/en/latest/

Python getpass module link: https://docs.python.org/2/library/getpass.html

 

A first look at using Requests with the DOCS REST services is to get a user’s Personal Workspace. Notice that importing the Requests library allows calling an HTTP GET on a REST URL. All of the HTTP work is done in Python Requests, and the client code need only pass in a URL, username, and password.

import requests
import getpass

docsurl='https://mydocsinstance/documents/api/1.1/folders/items'
username='peter.flies@oracle.com'

# Get password from user entry. This is using a module called "getpass"
pw = getpass.getpass("Enter password:")
response = requests.get(docsurl, auth=(username, pw))

# View the status code - should be 200. Error handling can be done with the status code. 
print (response.status_code)
# Header data is available from the Response object, which is part of Python Requests 
print (response.headers['content-type'])

# Get the JSON data
j = response.json()

# Navigate the JSON data using standard Python
print('count=' + j['count'])
print('errorCode=' + j['errorCode'])
print('totalResults=' + j['totalResults'])

print('Items:')
for item in j['items']:
	print ('type=' + item['type'] + ' name=' + item['name'] + ' owner=' + item['ownedBy']['displayName'])

 

An upload example requires a multipart HTTP POST request to send the file payload and a JSON payload. This example also shows the use of a Session in Python Requests. The Session can have the Authorization header set once and be re-used for all subsequent REST calls to Oracle Documents Cloud Service. The example below uploads all files in a directory to a user’s DOCS account. A folder GUID is needed for the upload to add the new files into the target folder. Some additional lines of code are added here to print out the number of milliseconds that each upload takes.

The upload request differs from the previous example in that the POST request needs a multipart payload to succeed. Notice that a data part called “jsonInputParameters” and a file part called “primaryFile” are both added to the request. Python’s os module can handle opening the file and placing it into the multipart. A loop can grab each file from the directory and submit an upload request.

import os
import getpass
import time
import requests

docsurl='https://mydocsinstance/documents/api/1.1'
path = 'C:/TEMP/upload'
username = 'peter.flies@oracle.com'
uploadfolderid = 'F26415F66B0BE6EE53314461T0000DEFAULT00000000'

# Get user input to set the password. Set the REST client authorization header by directly passing in the username and password 
pw = getpass.getpass("Enter password:")
 
files = (file for file in os.listdir(path) 
	if os.path.isfile(os.path.join(path, file)))

# Requests has a reusable Session object. Init it with the Authorization header
s = requests.Session()
s.auth = (username, pw)

print('Uploading files from path ' + path + ' to Documents Cloud Service ' + docsurl + '.\n')
for file in files: 
	startMillis = int(round(time.time() * 1000))
	print('Uploading ' + file + '...')
	print(path + '/' + file)
	resourcePath = 'files/data'
	fullpath = path + '/' + file
	files = {
		'primaryFile': open(fullpath, 'rb')
	}
	jsondata='{\"parentID\": \"' + uploadfolderid + '\" }'
	data = {
		'jsonInputParameters': jsondata
	}
	response = s.post(docsurl + '/' + resourcePath, files=files, data=data)		
	endMillis = int(round(time.time() * 1000))
	if(response.status_code == 200 or response.status_code == 201):
		j = response.json()
		print('Upload successful. ' + str(endMillis - startMillis) + ' ms. File id: ' + j['id'] + ', version number: ' + j['version'] + '\n')
	else:
		print('ERROR: Upload unsuccessful for file: ' + file)
		print(str(response.status_code) + ' ' + response.reason )
		if(response.json()):
			j = response.json()
			print('Error details: ' + j['errorCode'] + ' ' + j['errorMessage'])			
		else:
			print('Dump of response text:')
			print(response.text + '\n')



A file download is a HTTP GET but requires saving the file to a location on disk. Again, Python has utilities to simplify the downloading and saving of Oracle Documents Cloud files.

import getpass
import requests
import logging

docsurl='https://documents.us.oracle.com/documents/api/1.1'
savePath = 'C:/TEMP/download'
username = 'peter.flies@oracle.com'
fileid = 'DCA4BBA0908AE2F497832BC2T0000DEFAULT00000000'
resourcePath = 'files/{fileid}/data'
resourcePath = resourcePath.replace('{fileid}', fileid)

# Get user input to set the password. Set the REST client authorization header by directly passing in the username and password 
pw = getpass.getpass("Enter password:")
 
# Requests has a reusable Session object. Init it with the Authorization header
s = requests.Session()
s.auth = (username, pw)

with open(savePath, 'wb') as fhandle:
	response = s.get(docsurl + '/' + resourcePath, stream=True)

	if not response.ok:
		print("Error!") #What do you want to do in case of error?

	# write each chunk as it is received; the write must be inside the loop
	for filePiece in response.iter_content(1024):
		if not filePiece:
			break
		fhandle.write(filePiece)
#no JSON in response, but check it for status code, headers, etc. 
print(response.status_code)

 

For services that you may want to call frequently, a Python class can be created that wraps the Requests library. The commonly used service calls and associated parameters can have a method signature that sets the default parameters, but can also be overridden as needed. In the example class file below, the “DOCSClient” class has a constructor that initializes the Session. Once a handle to a DOCSClient is created, a method called “itemPersonalWorkspace” can be called with parameters to set the sort order, limit, and offset. This sample class has methods for only a few of the Documents Cloud Service REST calls, but the example can be applied to any DOCS REST API.

 

import requests
import logging

class DOCSClient:
	def __init__(self, docsurl, username, password):
		self.docsurl = docsurl
		self.version = '1.1'
		self.restBaseUrl = docsurl + '/api/' + self.version + '/'
		self.s = requests.Session()
		self.s.auth = (username, password)
		self.appLinkRoles = ('viewer', 'downloader', 'contributor')
		self.roles = ('viewer', 'downloader', 'contributor', 'manager')
	
	# Sample Item Resource REST method
	def itemPersonalWorkspace(self, orderby='name:asc', limit=50, offset=0):
		resourcePath = 'folders/items' + '?orderby=' + orderby + '&limit=' + str(limit) + '&offset=' + str(offset)
		return self.s.get(self.restBaseUrl + resourcePath)		
		
	# Sample Folder Resource REST methods
	def folderQuery(self, folderid):
		resourcePath = 'folders/{folderid}'
		resourcePath = resourcePath.replace('{folderid}', folderid)
		return self.s.get(self.restBaseUrl + resourcePath)		

	def folderCreate(self, folderid, name, description=''):
		resourcePath = 'folders/{folderid}'
		resourcePath = resourcePath.replace('{folderid}', folderid)
		resourcePath = resourcePath + '?name=' + name + '&description=' + description
		return self.s.post(self.restBaseUrl + resourcePath)		

	# Sample Upload Resource REST method
	def fileUpload(self, parentID, filepath):
		resourcePath = 'files/data'
		files = {
			'primaryFile': open(filepath, 'rb')
		}
		
		jsondata='{\"parentID\": \"' + parentID + '\" }'
		data = {
			'jsonInputParameters'  : jsondata
		}
		response = self.s.post(self.restBaseUrl + resourcePath, files=files, data=data)		
		return response

 

 

Lastly, the newly created class can be used via an import statement. The class in this case was stored in a file called “oracledocsrequests.py”. That file must be in the PYTHON_PATH to be found when the code is run, and once in the path the import statement is a single line to make the script aware of the DOCSClient class. Once a client object is created, with the URL, username, and password being passed in, any of the methods can be called in a single line. A folder creation example is shown below using one of the class methods. Note that the DOCSClient class defines the description parameter as empty by default, but the example overrides the empty string with “created from Python” as the folder description.

import getpass
from oracledocsrequests import DOCSClient

docsurl='https://documents.us.oracle.com/documents'
username='peter.flies@oracle.com'
folderid = 'F26415F66B0BE6EE53314461T0000DEFAULT00000000'

# Get user input to set the password. Set the REST client authorization header by directly passing in the username and password 
pw = getpass.getpass("Enter password:")

client = DOCSClient(docsurl, username, pw)

print('\n******* folderCreate *******')
response = client.folderCreate(folderid, 'My python folder', 'created from Python')
j = response.json()
print(response.status_code)
print('name=' + j['name'])

 

The Python Requests library slogan is “HTTP for Humans”. The powerful Requests library makes using the Oracle Documents Cloud Service REST API simple enough to write your own utilities to interact with a DOCS workspace.

 

 

 

 

 

 

Invoke Fusion Cloud Secured RESTFul Web Services


Introduction

The objective of this blog is to demonstrate how to invoke secured RESTful web services from Fusion Cloud using Oracle Service Oriented Architecture (SOA) as an integration hub for real-time integration with other clouds and on-premise applications. SOA can run on-premise or in the cloud (PaaS), and SOA composites deployed on-premise can be migrated to SOA in the cloud.

What is REST?

REST stands for Representational State Transfer. It ignores the details of implementation and applies a set of interaction constraints. Web service APIs that adhere to the REST architectural constraints are called RESTful. HTTP-based RESTful APIs are defined with the following aspects:

  • Exactly one entry point – For example: http://example.com/resources/
  • Support of media type data – JavaScript Object Notation (JSON) and XML are common
  • Standard HTTP Verbs (GET, PUT, POST, PATCH or DELETE)
  • Hypertext links to reference state
  • Hypertext links to reference related resources

Resources & Collections

The Resources can be grouped into collections. Each collection is homogeneous and contains only one type of resource. For example:

URI – Description – Example

/api/ – API Entry Point – /fusionApi/resources
/api/:coll/ – Top Level Collection :coll – /fusionApi/resources/department
/api/:coll/:id – Resource ID inside Collection – /fusionApi/resources/department/10
/api/:coll/:id/:subcoll – Sub-collection – /fusionApi/resources/department/10/employees
/api/:coll/:id/:subcoll/:subid – Sub Resource ID – /fusionApi/resources/department/10/employees/1001

 

Invoking Secured RestFul Service using Service Oriented Architecture (SOA)

SOA 12c supports the REST Adapter, which can be configured as a service binding component in a SOA composite application. For more information, please refer to this link. In order to invoke a secured RESTful service, Fusion security requirements must be met. The requirements are as follows:

Fusion Applications Security

All external URLs in the Oracle Fusion Cloud, for RESTful services, are secured using Oracle Web Services Manager (OWSM). The server policy is “oracle/http_jwt_token_client_policy”, which allows the following client authentication types:

  • HTTP Basic Authentication over Secure Socket Layer (SSL)
  • Oracle Access Manager(OAM) Token-service
  • Simple and Protected GSS-API Negotiate Mechanism (SPNEGO)
  • SAML token

JSON Web Token (JWT) is a light-weight implementation for web services authentication. A client holding a valid JWT is allowed to call the REST service until the token expires. The existing OWSM policy “oracle/wss11_saml_or_username_token_with_message_protection_service_policy” has the JWT over SSL assertion. For more information, please refer to this.

The client must provide one of the above policies in the security headers of the invocation call for authentication. In SOA, a client policy may be attached from Enterprise Manager (EM) to decouple it from the design time.

Fusion Security Roles

The user must have appropriate Fusion Roles including respective data security roles to view or change resources in Fusion Cloud. Each product pillar has respective roles. For example in HCM, a user must have any role that inherits the following roles:

  • HCM REST Services Duty – Example: “Human Capital Management Integration Specialist”
  • Data security Roles that inherit “Person Management Duty” – Example: “Human Resource Specialist – View All”

 

Design SOA Code using JDeveloper

In your SOA composite editor, right-click the Exposed Services swimlane and select Insert > REST. This action adds REST support as a service binding component to interact with the appropriate service component.

This is the sample SOA composite with the REST Adapter using a Mediator component (you can also use BPEL):

rest_composite

The following screens show how to configure the REST Adapter as an external reference:

REST Adapter Binding

rest_adapter_config_1

REST Operation Binding

rest_adapter_config_2

The REST Adapter converts the JSON response to XML using Native Format Builder (NXSD). For more information on configuring NXSD for JSON to XML, please refer to this link.

generic_json_to_xml_nxd

Attaching Oracle Web Service Manager (OWSM) Policy

Once the SOA composite is deployed to your SOA server, the HTTP Basic Authentication OWSM policy is attached as follows:

Navigate to your composite from EM and click on the Policies tab as follows:

 

rest_wsm_policy_from_EM_2

 

Identity Propagation

Once the OWSM policy is attached to your REST reference, the HTTP credentials can be passed using the Credential Store. Create the credential store key as follows:

1. Right-click on SOA Domain and select Security/Credentials.

rest_credential_1

2. Please see the following screen to create a key under oracle.wsm.security map:

 

rest_credential_2

Note: If oracle.wsm.security map is missing, then create this map before creating a key.

 

By default, the OWSM policy uses the basic.credentials key. To use the newly created key from above, override the default key using the following instructions:

1. Navigate to REST reference binding as follows:

rest_wsm_overridepolicyconfig

rest_wsm_overridepolicyconfig_2

Replace basic.credentials with your new key value.

 

Secure Socket Layer (SSL) Configuration

In Oracle Fusion Applications, the OWSM policy mandates the HTTPS protocol. For an introduction to SSL and detailed configuration, please refer to this link.

The cloud server certificate must be imported into two locations as follows:

1. keytool -import -alias slc08ykt -file /media/sf_C_DRIVE/JDeveloper/mywork/MyRestProject/facert.cer -keystore /oracle/xehome/app/soa12c/wlserver/server/lib/DemoTrust.jks -storepass DemoTrustKeyStorePassPhrase

This is the output:

Owner: CN=*.us.mycompany.com, DC=us, DC=mycompany, DC=com
Issuer: CN=*.us.mycompany.com, DC=us, DC=mycompany, DC=com
Serial number: 7
Valid from: Mon Apr 25 09:08:55 PDT 2011 until: Thu Apr 22 09:08:55 PDT 2021
Certificate fingerprints:
MD5: 30:0E:B4:91:F3:A4:A7:EE:67:6F:73:D3:E1:1B:A6:82
SHA1: 67:93:15:14:3E:64:74:27:32:32:26:43:FF:B8:B9:E6:05:A8:DE:49
SHA256: 01:0E:2A:8A:D3:A9:3B:A4:AE:58:4F:AD:2C:E7:BD:45:B7:97:6F:A0:C4:FA:96:A5:29:DD:77:85:3A:05:B1:B8
Signature algorithm name: MD5withRSA
Version: 1
Trust this certificate? [no]: yes
Certificate was added to keystore

2. keytool -import -alias <name> -file /media/sf_C_DRIVE/JDeveloper/mywork/MyRestPorject/facert.cer -trustcacerts -keystore /oracle/xehome/app/jdk1.7.0_55/jre/lib/security/cacerts

This is the output:

Enter keystore password:
Owner: CN=*.us.mycompany.com, DC=us, DC=mycompany, DC=com
Issuer: CN=*.us.mycompany.com, DC=us, DC=oracle, DC=com
Serial number: 7
Valid from: Mon Apr 25 09:08:55 PDT 2011 until: Thu Apr 22 09:08:55 PDT 2021
Certificate fingerprints:
MD5: 30:0E:B4:91:F3:A4:A7:EE:67:6F:73:D3:E1:1B:A6:82
SHA1: 67:93:15:14:3E:64:74:27:32:32:26:43:FF:B8:B9:E6:05:A8:DE:49
SHA256: 01:0E:2A:8A:D3:A9:3B:A4:AE:58:4F:AD:2C:E7:BD:45:B7:97:6F:A0:C4:FA:96:A5:29:DD:77:85:3A:05:B1:B8
Signature algorithm name: MD5withRSA
Version: 1
Trust this certificate? [no]: yes
Certificate was added to keystore

You must restart Admin and SOA Servers.

 

Testing

Deploy the above composite in your SOA server. The SOA composite can be invoked from EM or using tools like SOAPUI. Please see the following link to test REST adapter using HTTP Analyzer.

Conclusion

This blog demonstrates how to invoke secured REST services from Fusion Applications cloud using SOA. It provides detailed configuration on importing cloud keystores and attaching OWSM policies. This sample supports multiple patterns such as cloud-to-cloud, cloud-to-OnPremise, cloud-to-BPO, etc.

 

 

 


HCM Atom Feed Subscriber using Node.js


Introduction

HCM Atom feeds provide notifications of Oracle Fusion Human Capital Management (HCM) events and are tightly integrated with REST services. When an event occurs in Oracle Fusion HCM, the corresponding Atom feed is delivered automatically to the Atom server. The feed contains details of the REST resource on which the event occurred. Subscribers who consume these Atom feeds use the REST resources to retrieve additional information about the resource.

For more information on Atom, please refer to this.

This post focuses on consuming and processing HCM Atom feeds using Node.js. The assumption is that the reader has some basic knowledge on Node.js. Please refer to this link to download and install Node.js in your environment.

Node.js is a programming platform that lets you run JavaScript code on the server, much like JavaScript runs in the browser. It enables real-time, two-way connections in web applications with push capability, using a non-blocking, event-driven I/O paradigm. It runs on a single-threaded event loop and leverages asynchronous calls for operations such as I/O. This is an evolution from the stateless request-response paradigm of the traditional web. For example, when a request is sent to invoke a service such as REST or a database query, Node.js continues serving new requests; when the response comes back, it jumps back to the respective requestor. Node.js is lightweight and provides a high level of concurrency. However, it is not suitable for CPU-intensive operations as it is single threaded.

Node.js is built on an event-driven, asynchronous model. Incoming requests are non-blocking: each request is passed off to an asynchronous callback handler, which frees up the main thread to respond to more requests.

For more information on Node.js, please refer this.

 

Main Article

Atom feeds enable you to keep track of any changes made to feed-enabled resources in Oracle HCM Cloud. For any updates that may be of interest for downstream applications, such as new hire, terminations, employee transfers and promotions, Oracle HCM Cloud publishes Atom feeds. Your application will be able to read these feeds and take appropriate action.

Atom Publishing Protocol (AtomPub) allows software applications to subscribe to changes that occur on REST resources through published feeds. Updates are published when changes occur to feed-enabled resources in Oracle HCM Cloud. The following are the primary Atom feeds:

Employee Feeds

New hire
Termination
Employee update

Assignment creation, update, and end date

Work Structures Feeds (Creation, update, and end date)

Organizations
Jobs
Positions
Grades
Locations

The above feeds can be consumed programmatically. In this post, Node.js is implemented as one of the solutions consuming “Employee New Hire” feeds, but the design and development are similar for all the supported objects in HCM.

 

Refer to my blog on how to invoke secured REST services using Node.js.

Security

The RESTful services in Oracle HCM Cloud are protected with Oracle Web Services Manager (OWSM). The server policy allows the following client authentication types:

  • HTTP Basic Authentication over Secure Socket Layer (SSL)
  • Oracle Access Manager(OAM) Token-service
  • Simple and Protected GSS-API Negotiate Mechanism (SPNEGO)
  • SAML token

The client must provide one of the above policies in the security headers of the invocation call for authentication. The sample in this post is using HTTP Basic Authentication over SSL policy.

 

Fusion Security Roles

REST and Atom Feed Roles

To use Atom feed, a user must have any HCM Cloud role that inherits the following roles:

  • “HCM REST Services and Atom Feeds Duty” – for example, Human Capital Management Integration Specialist
  • “Person Management Duty” – for example, Human Resource Specialist

REST/Atom Privileges

 

Privilege Name – Resource and Method

PER_REST_SERVICE_ACCESS_EMPLOYEES_PRIV – emps (GET, POST, PATCH)
PER_REST_SERVICE_ACCESS_WORKSTRUCTURES_PRIV – grades (get), jobs (get), jobFamilies (get), positions (get), locations (get), organizations (get)
PER_ATOM_WORKSPACE_ACCESS_EMPLOYEES_PRIV – employee/newhire (get), employee/termination (get), employee/empupdate (get), employee/empassignment (get)
PER_ATOM_WORKSPACE_ACCESS_WORKSTRUCTURES_PRIV – workstructures/grades (get), workstructures/jobs (get), workstructures/jobFamilies (get), workstructures/positions (get), workstructures/locations (get), workstructures/organizations (get)

 

 

Atom Payload Response Structure

The Atom feed response is in XML format. Please see the following diagram to understand the feed structure:

 

AtomFeedSample_1

 

A feed can have multiple entries. The entries are ordered by “updated” timestamp of the <entry> and the first one is the latest. There are two critical elements that will provide information on how to process these entries downstream.

Content

The <content> element contains critical attributes such as Employee Number, Phone, Suffix, CitizenshipLegislation, EffectiveStartDate, Religion, PassportNumber, NationalIdentifierType, EventDescription, LicenseNumber, EmployeeName, WorkEmail, and NationalIdentifierNumber. It is in JSON format, as you can see from the above diagram.

Resource Link

If the data provided in <content> is not sufficient, the RESTful service resource link is provided to get more details. Please refer to the above diagram for the employee resource link in each entry. Node.js can invoke this resource link, as sketched below.
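A minimal sketch of such a follow-up call (reusing the http/https handle, uname, pword, and parsed feed object from the prototype shown later in this post, and assuming the href in the entry is an absolute URL) might look like this:

// hedged sketch: invoke the employee resource link found in a feed entry
var url = require('url');

var resourceHref = feed.findall('./entry/link/[@rel="related"]')[0].get('href');
var parsed = url.parse(resourceHref);

http.get({
  host: parsed.hostname,
  port: parsed.port || 443,
  path: parsed.path,
  headers: { 'Authorization': 'Basic ' + new Buffer(uname + ':' + pword).toString('base64') }
}, function(res) {
  var body = '';
  res.on('data', function(chunk) { body += chunk; });
  res.on('end', function() {
    // the employee resource returns JSON
    console.log(JSON.parse(body));
  });
});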

 

Avoid Duplicate Atom Feed Entries

To avoid consuming feeds with duplicate entries, one of the following parameters must be provided to consume only the feeds published since the last poll:

1. updated-min: Returns entries within the collection where Atom:updated > updated-min

Example: https://hclg-test.hcm.us2.oraclecloud.com/hcmCoreApi/Atomservlet/employee/newhire?updated-min=2015-09-16T09:16:00.000Z – Returns entries published after “2015-09-16T09:16:00.000Z”.

2. updated-max: Returns entries within the collection where Atom:updated <= updated-max

Example: https://hclg-test.hcm.us2.oraclecloud.com/hcmCoreApi/Atomservlet/employee/newhire?updated-max=2015-09-16T09:16:00.000Z – Returns entries published at/before “2015-09-16T09:16:00.000Z”.

3. updated-min & updated-max: Returns entries within the collection where (Atom:updated > updated-min && Atom:updated <= updated-max)

Example: https://hclg-test.hcm.us2.oraclecloud.com/hcmCoreApi/Atomservlet/employee/newhire?updated-min=2015-09-11T10:03:35.000Z&updated-max=2015-09-16T09:16:00.000Z – Returns entries published between “2015-09-11T10:03:35.000Z” and “2015-09-16T09:16:00.000Z”.

Node.js Implementation

Refer my blog on how to invoke secured REST services using Node.js. These are the following things to consider when consuming feeds:

Initial Consumption

When you subscribe for the first time, you can invoke the resource with no query parameters to get all the published feeds, or use the updated-min or updated-max arguments to filter the entries you begin with.

For example, the invocation path could be /hcmCoreApi/Atomservlet/employee/newhire or /hcmCoreApi/Atomservlet/employee/newhire?updated-min=<some-timestamp>

After the first consumption, the “updated” element of the first entry must be persisted to use it in the next call to avoid duplication. In this prototype, the “/entry/updated” timestamp value is persisted in a file.

For example:

//persist timestamp for the next call
if (i == 0) {
  fs.writeFile('updateDate', updateDate[0].text, function(fserr) {
    if (fserr) throw fserr;
  });
}

 

Next Call

In the next call, read the updated timestamp value from the persisted file to generate the path as follows:

//Check if updateDate file exists and is not empty
try {
  var lastFeedUpdateDate = fs.readFileSync('updateDate');
  console.log('Last Updated Date is: ' + lastFeedUpdateDate);
} catch (e) {
  // handle error
}

if (lastFeedUpdateDate.length > 0) {
  pathUri = '/hcmCoreApi/Atomservlet/employee/newhire?updated-min=' + lastFeedUpdateDate;
} else {
  pathUri = '/hcmCoreApi/Atomservlet/employee/newhire';
}

 

Parsing Atom Feed Response

The Atom feed response is in XML format, as shown previously in the diagram. In this prototype, the “node-elementtree” package is used to parse the XML. You can use any library, as long as the following data is extracted from each entry in the feed for downstream processing.

var et = require('elementtree');
//Request call
var request = http.get(options, function(res){
  var body = "";
  res.on('data', function(data) {
    body += data;
  });
  res.on('end', function() {

    //Parse Feed Response - the structure is defined in section: Atom Payload Response Structure
    feed = et.parse(body);

    //Identify if feed has any entries
    var numberOfEntries = feed.findall('./entry/').length;

    //if there are entries, extract data for downstream processing
    if (numberOfEntries > 0) {
      console.log('Get Content for each Entry');

      //Get Data based on XPath Expression
      var content = feed.findall('./entry/content/');
      var entryId = feed.findall('./entry/id');
      var updateDate = feed.findall('./entry/updated');

      for ( var i = 0; i < content.length; i++ ) {

        //get Resource link for the respective entry
        console.log(feed.findall('./entry/link/[@rel="related"]')[i].get('href'));

        //get Content data of the respective entry, which is in JSON format
        console.log(content[i].text);

        //persist timestamp for the next call
        if (i == 0) {
          fs.writeFile('updateDate', updateDate[0].text, function(fserr) {
            if (fserr) throw fserr;
          });
        }

One and Only One Entry

Each entry in an Atom feed has a unique ID. For example: <id>Atomservlet:newhire:EMP300000005960615</id>

In target applications, this ID can be used as one of the keys or lookups to prevent reprocessing. The logic can be implemented in your downstream applications or in the integration space to avoid duplication.
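As an illustration only, a Node.js consumer could keep a marker file per processed entry ID and skip anything it has already seen; the directory name and file-per-entry approach below are assumptions, not part of the HCM API:

// hedged sketch: skip entries whose ID has already been processed
var fs = require('fs');

function alreadyProcessed(id) {
  try {
    fs.accessSync('processed/' + id); // marker file written after successful processing
    return true;
  } catch (e) {
    return false;
  }
}

// inside the entry loop of the prototype:
// if (!alreadyProcessed(entryId[i].text)) {
//   ... hand the entry off to the downstream application ...
//   fs.writeFileSync('processed/' + entryId[i].text, content[i].text);
// }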

 

Downstream Processing Pattern

A Node.js scheduler can be implemented to consume feeds periodically. Once the message is parsed, there are several patterns to support various use cases. In addition, you could have multiple subscribers such as Employee new hire, Employee termination, locations, jobs, positions, etc. For guaranteed transactions, each feed entry can be published to Messaging Cloud or Oracle Database to stage all the feeds. This pattern provides global transactions and recovery when downstream applications are not available or throw errors. The following diagram shows the high-level architecture:

nodejs_soa_atom_pattern
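As a minimal illustration of the periodic consumption step in the architecture above, the feed invocation from the prototype can be wrapped in a function and driven by a simple interval timer; the polling frequency below is an arbitrary assumption:

// hedged sketch: poll the Atom feed on a fixed interval
var POLL_INTERVAL_MS = 15 * 60 * 1000; // assumption: poll every 15 minutes

function consumeNewHireFeed() {
  // wrap the http.get(...) invocation from the prototype in this function
  // so the same logic runs on every tick
}

consumeNewHireFeed();                               // run once at start-up
setInterval(consumeNewHireFeed, POLL_INTERVAL_MS);  // then keep polling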

 

Conclusion

This post demonstrates how to consume HCM Atom feeds and process them for downstream applications. It provides details on how to consume only the new feeds (avoiding duplication) since the last poll. Finally, it provides an enterprise integration pattern from feed consumption to downstream application processing.

 

Sample Prototype Code

var et = require('elementtree');

var uname = 'username';
var pword = 'password';
var http = require('https'),
fs = require('fs');

var XML = et.XML;
var ElementTree = et.ElementTree;
var element = et.Element;
var subElement = et.SubElement;

var lastFeedUpdateDate = '';
var pathUri = '';

//Check if updateDate file exists and is not empty
try {
var lastFeedUpdateDate = fs.readFileSync('updateDate');
console.log('Last Updated Date is: ' + lastFeedUpdateDate);
} catch (e) {
// add error logic
}

//get last feed updated date to get entries since that date
if (lastFeedUpdateDate.length > 0) {
pathUri = '/hcmCoreApi/atomservlet/employee/newhire?updated-min=' + lastFeedUpdateDate;
} else {
pathUri = '/hcmCoreApi/atomservlet/employee/newhire';
}

// Generate Request Options
var options = {
ca: fs.readFileSync('HCM Cert'), //get HCM Cloud certificate - either through openssl or export from web browser
host: 'HCMHostname',
port: 443,
path: pathUri,
"rejectUnauthorized" : false,
headers: {
'Authorization': 'Basic ' + new Buffer(uname + ':' + pword).toString('base64')
}
};

//Invoke REST resource for Employee New Hires
var request = http.get(options, function(res){
var body = "";
res.on('data', function(data) {
body += data;
});
res.on('end', function() {

//Parse Atom Payload response 
feed = et.parse(body);

//Get Entries count
var numberOfEntries = feed.findall('./entry/').length;

console.log('...................Feed Extracted.....................');
console.log('Number of Entries: ' + numberOfEntries);

//Process each entry
if (numberOfEntries > 0) {

console.log('Get Content for each Entry');

var content = feed.findall('./entry/content/');
var entryId = feed.findall('./entry/id');
var updateDate = feed.findall('./entry/updated');

for ( var i = 0; i < content.length; i++ ) {
//get Resource link for the respective entry
console.log(feed.findall('./entry/link/[@rel="related"]')[i].get('href'));
//get Content data of the respective entry, which is in JSON format
console.log(content[i].text);

//persist timestamp for the next call
if (i == 0) {
fs.writeFile('updateDate', updateDate[0].text, function(fserr) {
if (fserr) throw fserr; } );
}

fs.writeFile(entryId[i].text,content[i].text, function(fserr) {
if (fserr) throw fserr; } );
}
}

})
res.on('error', function(e) {
console.log("Got error: " + e.message);
});
});

 

 

HCM Atom Feed Subscriber using SOA Cloud Service


Introduction

HCM Atom feeds provide notifications of Oracle Fusion Human Capital Management (HCM) events and are tightly integrated with REST services. When an event occurs in Oracle Fusion HCM, the corresponding Atom feed is delivered automatically to the Atom server. The feed contains details of the REST resource on which the event occurred. Subscribers who consume these Atom feeds use the REST resources to retrieve additional information about the resource.

For more information on Atom, please refer to this.

This post focuses on consuming and processing HCM Atom feeds using Oracle Service Oriented Architecture (SOA) Cloud Service. Oracle SOA Cloud Service provides a PaaS computing platform solution for running Oracle SOA Suite, Oracle Service Bus, and Oracle API Manager in the cloud. For more information on SOA Cloud Service, please refer this.

Oracle SOA is the industry’s most complete and unified application integration and SOA solution. It transforms complex application integration into agile and re-usable service-based connectivity to speed time to market, respond faster to business requirements, and lower costs. SOA facilitates the development of enterprise applications as modular business web services that can be easily integrated and reused, creating a truly flexible, adaptable IT infrastructure.

For more information on getting started with Oracle SOA, please refer this. For developing SOA applications using SOA Suite, please refer this.

 

Main Article

Atom feeds enable you to keep track of any changes made to feed-enabled resources in Oracle HCM Cloud. For any updates that may be of interest for downstream applications, such as new hire, terminations, employee transfers and promotions, Oracle HCM Cloud publishes Atom feeds. Your application will be able to read these feeds and take appropriate action.

Atom Publishing Protocol (AtomPub) allows software applications to subscribe to changes that occur on REST resources through published feeds. Updates are published when changes occur to feed-enabled resources in Oracle HCM Cloud. The following are the primary Atom feeds:

Employee Feeds

New hire
Termination
Employee update

Assignment creation, update, and end date

Work Structures Feeds (Creation, update, and end date)

Organizations
Jobs
Positions
Grades
Locations

The above feeds can be consumed programmatically. In this post, SOA Cloud Service is implemented as one of the solutions consuming “Employee New Hire” feeds, but the design and development are similar for all the supported objects in HCM.

 

HCM Atom Introduction

For Atom “security, roles and privileges”, please refer to my blog HCM Atom Feed Subscriber using Node.js.

 

Atom Feed Response Template

 

AtomFeedSample_1

SOA Cloud Service Implementation

Refer to my blog on how to invoke secured REST services using SOA. The following diagram shows the patterns to subscribe to HCM Atom feeds and process them to downstream applications that may have either web service or file based interfaces. Optionally, all entries from the feeds could be staged in a database or messaging cloud before processing, to handle situations where a downstream application is not available or is throwing system errors. This provides the ability to consume the feeds, but hold the processing until downstream applications are available. Enterprise Scheduler Service (ESS), a component of SOA Suite, is leveraged to invoke the subscriber composite periodically.

 

soacs_atom_pattern

The following diagram shows the implementation of the above pattern for Employee New Hire:

soacs_atom_composite

 

Feed Invocation from SOA

Although the HCM cloud feed is in XML representation, the media type of the payload response is “application/atom+xml”. This media type is not supported by the REST Adapter at this time, so use the following Java embedded activity in your BPEL component.

Once the built-in REST Adapter supports the Atom media type, the Java embedded activity can be replaced, further simplifying the solution.

try {

String url = "https://mycompany.oraclecloud.com";
String lastEntryTS = (String)getVariableData("LastEntryTS");
String uri = "/hcmCoreApi/atomservlet/employee/newhire";

//Generate URI based on last entry timestamp from previous invocation
if (!(lastEntryTS.isEmpty())) {
uri = uri + "?updated-min=" + lastEntryTS;
}

java.net.URL obj = new URL(null,url+uri, new sun.net.www.protocol.https.Handler());

javax.net.ssl.HttpsURLConnection conn = (HttpsURLConnection) obj.openConnection();
conn.setRequestProperty("Content-Type", "application/vnd.oracle.adf.resource+json");
conn.setDoOutput(true);
conn.setRequestMethod("GET");

String userpass = "username" + ":" + "password";
String basicAuth = "Basic " + javax.xml.bind.DatatypeConverter.printBase64Binary(userpass.getBytes("UTF-8"));
conn.setRequestProperty ("Authorization", basicAuth);

String response="";
int responseCode=conn.getResponseCode();
System.out.println("Response Code is: " + responseCode);

if (responseCode == HttpsURLConnection.HTTP_OK) {

BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream()));

String line;
String contents = "";

while ((line = reader.readLine()) != null) {
contents += line;
}

setVariableData("outputVariable", "payload", "/client:processResponse/client:result", contents);

reader.close();

}

} catch (Exception e) {
e.printStackTrace();
}

 

These are the following things to consider when consuming feeds:

Initial Consumption

When you subscribe for the first time, you can invoke the resource with no query parameters to get all the published feeds, or use the updated-min or updated-max arguments to filter the entries you begin with.

For example, the invocation path could be /hcmCoreApi/Atomservlet/employee/newhire or /hcmCoreApi/Atomservlet/employee/newhire?updated-min=<some-timestamp>

After the first consumption, the “updated” element of the first entry must be persisted to use it in the next call to avoid duplication. In this prototype, the “/entry/updated” timestamp value is persisted in Database Cloud (DBaaS).

This is the sample database table:

create table atomsub (
id number,
feed_ts varchar2(100) );

For initial consumption, keep the table empty or add a row with the value of feed_ts to consume initial feeds. For example, the feed_ts value could be “2015-09-16T09:16:00.000Z” to get all the feeds after this timestamp.

In SOA composite, you will update the above table to persist the “/entry/updated” timestamp in the feed_ts column of the “atomsub” table.

 

Next Call

In the next call, read the updated timestamp value from the database and generate the URI path as follows:

String uri = "/hcmCoreApi/atomservlet/employee/newhire";
String lastEntryTS = (String)getVariableData("LastEntryTS");
if (!(lastEntryTS.isEmpty())) {
uri = uri + "?updated-min=" + lastEntryTS;
}

The above step is done in the Java embedded activity, but it could also be done in SOA using <assign> expressions.

Parsing Atom Feed Response

The Atom feed response is in XML format, as shown previously in the diagram. In this prototype, the feed response is stored in the output variable as a string. The following expression in an <assign> activity will convert it to XML:

oraext:parseXML($outputVariable.payload/client:result)


Parsing Each Atom Entry for Downstream Processing

Each entry has two major elements as mentioned in Atom response payload structure.

Resource Link

This contains the REST employee resource link to get the Employee object. This is a typical REST invocation from SOA using the REST Adapter. For more information on invoking REST services from SOA, please refer to my blog.

 

Content Type

This contains selected resource data in JSON format. For example:

{ "Context" : [ { "EmployeeNumber" : "212", "PersonId" : "300000006013981", "EffectiveStartDate" : "2015-10-08", "EffectiveDate" : "2015-10-08", "WorkEmail" : "phil.davey@mycompany.com", "EmployeeName" : "Davey, Phillip" } ] }

In order to use the above data, it must be converted to XML. The BPEL component provides a Translate activity to transform JSON to XML. Please refer to the SOA Development document, section B1.8 – doTranslateFromNative.

 

The <Translate> activity syntax to convert above JSON string from <content> is as follows:

<assign name="TranslateJSON">
<bpelx:annotation>
<bpelx:pattern>translate</bpelx:pattern>
</bpelx:annotation>
<copy>
 <from>ora:doTranslateFromNative(string($FeedVariable.payload/ns1:entry/ns1:content), 'Schemas/JsonToXml.xsd', 'Root-Element', 'DOM')</from>
 <to>$JsonToXml_OutputVar_1</to>
 </copy>
</assign>

This is the output:

jsonToXmlOutput

The following provides detailed steps on how to use Native Format Builder in JDeveloper:

In native format builder, select JSON format and use above <content> as a sample to generate a schema. Please see the following diagrams:

JSON_nxsd_1JSON_nxsd_2JSON_nxsd_3

JSON_nxsd_5

 

One and Only One Entry

Each entry in an Atom feed has a unique ID. For example: <id>Atomservlet:newhire:EMP300000005960615</id>

In target applications, this ID can be used as one of the keys or lookups to prevent reprocessing. The logic can be implemented in your downstream applications or in the integration space to avoid duplication.

 

Scheduler and Downstream Processing

Oracle Enterprise Scheduler Service (ESS) is configured to invoke the above composite periodically. At present, SOA Cloud Service is not provisioned with ESS, but refer to this to extend your domain. Once the feed response message is parsed, you can process it to downstream applications based on your requirements or use cases. For guaranteed transactions, each feed entry can be published to Messaging Cloud or Oracle Database to stage all the feeds. This provides global transactions and recovery when downstream applications are not available or throw errors.

The following diagram shows how to create job definition for a SOA composite. For more information on ESS, please refer this.

ess_3

SOA Cloud Service Instance Flows

First invocation without updated-min argument to get all the feeds

 

soacs_atom_instance_json

Atom Feed Response from above instance

AtomFeedResponse_1

 

Next invocation with updated-min argument based on last entry timestamp

soacs_atom_instance_noentries

 

Conclusion

This post demonstrates how to consume HCM Atom feeds and process them for downstream applications. It provides details on how to consume only the new feeds (avoiding duplication) since the last poll. Finally, it provides an enterprise integration pattern from feed consumption to downstream application processing.

 

Sample Prototype Code

The sample prototype code is available here.

 

soacs_atom_composite_1

 

 

Exploring OAM’s SAML Identity Assertion


Introduction

OAM (Oracle Access Manager) has an interesting feature that often goes unnoticed by a considerable number of people wishing to tackle the problem of identity propagation. It’s OAM’s ability to generate a secure token embedding user information as a result of successful authentication or authorization. My colleagues Rob Otto and Simon Kissane have talked about it in “Retrieving the OAM SessionID for Fun and Profit” and “Authenticating to OIM SCIM server using an OAM-generated SAML identity assertion”.

Motivated by a recent customer inquiry, in this post I want to talk about this ability from the perspective of browser-based clients invoking REST services, a very common pattern these days. Imagine, for instance, that the end user identity must be securely propagated from an AngularJS-based application to REST services deployed on WebLogic Server or JBoss. While there are a few solution options to consider, here I want to focus on this pre-built OAM feature that requires near-to-zero implementation effort. Additionally, I want to say a few words on how to customize that secure token by adding extra information to be eventually consumed by downstream resources.

The use case in question is as simple as this:

Use Case

Use Case

In the context of Oracle Fusion Middleware, the Identity Assertion feature usage is covered in Using Identity Context chapter of OAM’s Administrator’s Guide.

The Identity Assertion

A protected resource in OAM is associated with an Authentication policy and, optionally, with an Authorization policy.
In OAM admin console, if you look at the Response tab of either Authentication or Authorization policy, there’s a check box named “Identity Assertion”.

Identity Assertion

Identity Assertion

Marking the checkbox makes the OAM server issue a SAML assertion as a result of successful authentication and/or authorization. OAM then adds the assertion as an HTTP response header named “OAM_IDENTITY_ASSERTION” back to the requesting Webgate. This is a very important aspect to understand: the response header is NOT sent back to the browser, which is very welcome, because it makes any token hijack attempt much harder. With the request authorized, the Webgate turns the response header into a request header that is forwarded to the downstream resource being invoked by the HTTP server.

Pretty much the same process natively happens with the OAM_REMOTE_USER HTTP header, which already carries the end user identity. So why bother with OAM_IDENTITY_ASSERTION? For basically three reasons (1 and 2 actually being consequences of 3):

1) it’s safer, because it’s digitally signed by OAM server;
2) it can convey much more information than a simple user id;
3) it’s a standard SAML assertion, therefore, interoperable.

Here’s how it looks:

<saml:Assertion xmlns:saml="urn:oasis:names:tc:SAML:2.0:assertion" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xs="http://www.w3.org/2001/XMLSchema" Version="2.0" ID="fd53fa85-4646-41e3-9d4b-e95bc3c56b33" IssueInstant="2016-03-31T12:49:06Z">
	<saml:Issuer>OAM User Assertion Issuer</saml:Issuer>
	<dsig:Signature xmlns:dsig="http://www.w3.org/2000/09/xmldsig#">
		<dsig:SignedInfo>
			<dsig:CanonicalizationMethod Algorithm="http://www.w3.org/2001/10/xml-exc-c14n#"/>
			<dsig:SignatureMethod Algorithm="http://www.w3.org/2000/09/xmldsig#rsa-sha1"/>
			<dsig:Reference URI="#fd53fa85-4646-41e3-9d4b-e95bc3c56b33">
				<dsig:Transforms>
					<dsig:Transform Algorithm="http://www.w3.org/2000/09/xmldsig#enveloped-signature"/>
					<dsig:Transform Algorithm="http://www.w3.org/2001/10/xml-exc-c14n#"/>
				</dsig:Transforms>
				<dsig:DigestMethod Algorithm="http://www.w3.org/2000/09/xmldsig#sha1"/>
				<dsig:DigestValue>KWCLd8vBg0KW5fBHD+D7ALxLsh4=</dsig:DigestValue>
			</dsig:Reference>
		</dsig:SignedInfo>
		<dsig:SignatureValue>R9fNdqSqiTQaG5gDDjv5Gue3ziZPNUfLgcT880ViUDiN3HcCpKLJ1L2PIKfQgMIjajZXO/PN/j+IC8SlmBeRZ/bI9BmHF9skqI2A+Q0+uJfgqnyw+Fy/nIPGGraTK3AVsivv5j5tkdeDVJ+dBUfBT+Gf6A/onVp7YSwpAQ48psg=</dsig:SignatureValue>
		<dsig:KeyInfo>
			<dsig:X509Data>
				<dsig:X509Certificate>MIIBxzCCATACAWYwDQYJKoZIhvcNAQEEBQAwLDEqMCgGA1UEAxMhT0FNIFVzZXIgQXNzZXJ0aW9uIElzc3VlciBDQSBSb290MB4XDTE1MTAxMzE1MTIwMVoXDTI0MDMyMjE1MTIwMVowLDEqMCgGA1UEAxMhT0FNIFVzZXIgQXNzZXJ0aW9uIElzc3VlciBDQSBSb290MIGfMA0GCSqGSIb3DQEBAQUAA4GNADCBiQKBgQCX1C6Qrk42DsLD0QC4mx9U0kyl2MD6K1qu13N9qqv/xYHi2nmM6h/M8frFP0Czngjlm7gHzgHDRVLkMBxEiOOOpChOnygF0OhdrmeziwUNd2VxjKf8pDU17YYR06lwj4ad702Z4dFmz+rsBX/MPap8XzfwOa6Dj1DPa/5xC7buswIDAQABMA0GCSqGSIb3DQEBBAUAA4GBADCM5s2fUm4lHenm3BlRwq8JVjj6D31DWKuN4qjMKY1vHluqmfexjofzs2PtAk/4bwZN4DIKJg6qVTs5YqStlGcvDsaBsSJoxEmPOJ8PF7jdDP1bxZfxfz6AajthA4fMfwPfVDu++VGEBZ9AYBc7f9tskIDN/TVyntQlWD1he9Ru</dsig:X509Certificate>
				<dsig:X509IssuerSerial>
					<dsig:X509IssuerName>CN=OAM User Assertion Issuer CA Root</dsig:X509IssuerName>
					<dsig:X509SerialNumber>102</dsig:X509SerialNumber>
				</dsig:X509IssuerSerial>
				<dsig:X509SubjectName>CN=OAM User Assertion Issuer CA Root</dsig:X509SubjectName>
			</dsig:X509Data>
		</dsig:KeyInfo>
	</dsig:Signature>
	<saml:Subject>
		<saml:NameID NameQualifier="oud_slc09iug" Format="urn:oasis:names:tc:SAML:1.1:nameid-format:X509SubjectName" SPProvidedID="SADMIN">uid=SADMIN,cn=Users,dc=oracle,dc=com</saml:NameID>
		<saml:SubjectConfirmation Method="urn:oasis:names:tc:SAML:2.0:cm:bearer">
			<saml:SubjectConfirmationData Address="10.88.248.71"/>
		</saml:SubjectConfirmation>
	</saml:Subject>
	<saml:Conditions NotBefore="2016-03-31T12:49:06Z" NotOnOrAfter="2016-03-31T20:49:06Z"/>
	<saml:AuthnStatement AuthnInstant="2016-03-31T12:49:00Z">
		<saml:AuthnContext>
			<saml:AuthnContextClassRef>urn:oasis:names:tc:SAML:2.0:ac:classes:Unspecified</saml:AuthnContextClassRef>
		</saml:AuthnContext>
	</saml:AuthnStatement>
	<saml:AttributeStatement>
		<saml:Attribute NameFormat="urn:oasis:names:tc:SAML:2.0:attrname-format:uri" Name="urn:oasis:names:tc:SAML:2.0:profiles:session:sessionId">
			<saml:AttributeValue xsi:type="xs:string">a316e3d4-0a54-4c28-a398-694dad853b1a|4QCSd0VILGDCLvqf5WH+l566Mbk=</saml:AttributeValue>
		</saml:Attribute>
		<saml:Attribute NameFormat="urn:oasis:names:tc:SAML:2.0:attrname-format:uri" Name="urn:oasis:names:tc:SAML:2.0:profiles:session:authenticationStrength">
			<saml:AttributeValue xsi:type="xs:integer">2</saml:AttributeValue>
		</saml:Attribute>
		<saml:Attribute NameFormat="urn:oasis:names:tc:SAML:2.0:attrname-format:uri" Name="urn:oasis:names:tc:SAML:2.0:profiles:session:timeLastActive">
			<saml:AttributeValue xsi:type="xs:dateTime">2016-03-31T12:49:06Z</saml:AttributeValue>
		</saml:Attribute>
		<saml:Attribute NameFormat="urn:oasis:names:tc:SAML:2.0:attrname-format:uri" Name="urn:oasis:names:tc:SAML:2.0:profiles:session:tokenFormatVersion">
			<saml:AttributeValue xsi:type="xs:string">1.0</saml:AttributeValue>
		</saml:Attribute>
		<saml:Attribute NameFormat="urn:oasis:names:tc:SAML:2.0:attrname-format:basic" Name="oracle:idm:claims:ids:attributes">
			<saml:AttributeValue xsi:type="xs:string">email=sadmin@oracle.com</saml:AttributeValue>
		</saml:Attribute>
		<saml:Attribute NameFormat="urn:oasis:names:tc:SAML:2.0:attrname-format:basic" Name="oracle:idm:claims:tenant:name">
			<saml:AttributeValue xsi:type="xs:string">siebel</saml:AttributeValue>
		</saml:Attribute>
	</saml:AttributeStatement>
</saml:Assertion>

A few important aspects to notice when looking at the Identity Assertion are listed below. All of these are valuable pieces of information for the component/agent that is going to validate the assertion or make authorization decisions based on user information right before letting the request hit the invoked resource.

1) The SAML issuer: the value is always “OAM User Assertion Issuer”.

2) The digital signature, signed with the OAM server private key. A required task in validating the assertion is verifying the signature. As such, the OAM server public key must be exported and made available to the validating agent. Execute the following steps to export the public key (courtesy of my colleague Simon Kissane):

a) Find out what your .oamkeystore password is:
Start $OAM_ORACLE_HOME/common/bin/wlst.sh

> cd $OAM_ORACLE_HOME/common/bin
> ./wlst.sh
> connect();
<enter username and password>
> print(mbs.invoke(ObjectName('com.oracle.jps:type=JpsCredentialStore'),"getPortableCredential",["OAM_STORE","jks"],["java.lang.String","java.lang.String"]).get("password"));
<copy password output>
> exit();

b) To export the assertion-cert certificate from $OAM_DOMAIN_HOME/config/fmwconfig/.oamkeystore, run the following command:

> export OAM_DOMAIN_HOME=<path_to_your_oam_domain_folder>
> cd $JAVA_HOME/bin
> ./keytool -exportcert -keystore $OAM_DOMAIN_HOME/config/fmwconfig/.oamkeystore -storetype JCEKS -storepass <password_obtained_in_step_a> -alias assertion-cert -file /tmp/assertion-cert.cer

3) The SAML Subject NameID: contains the user’s DN (Distinguished Name) in the underlying LDAP server. If the validating component/agent needs to establish a user context, this is the identity to parse and assert.

4) The SAML AttributeStatement: containing both OAM session information and custom user attributes. Notice the element

<saml:Attribute NameFormat="urn:oasis:names:tc:SAML:2.0:attrname-format:uri" Name="urn:oasis:names:tc:SAML:2.0:profiles:session:sessionId">
     <saml:AttributeValue xsi:type="xs:string">a316e3d4-0a54-4c28-a398-694dad853b1a|4QCSd0VILGDCLvqf5WH+l566Mbk=</saml:AttributeValue>
</saml:Attribute>

As pointed out by Rob Otto in his post, the string a316e3d4-0a54-4c28-a398-694dad853b1a|4QCSd0VILGDCLvqf5WH+l566Mbk= corresponds to the user session id in the OAM server. It could be used by the validating component/agent, for instance, to cross-check the identity assertion against a valid OAM session, although I wouldn’t recommend doing so without considering the security requirements of the service being accessed. The reason is performance, since that would involve a remote call back to the OAM server on every service access. Verifying the digital signature seems good enough for the vast majority of services, since a compromised OAM private key already means a deeply serious disaster. But if for some reason we really need to cross-check the session id, one idea is doing it selectively, based on some custom attribute in the identity assertion itself.

As mentioned, extra information can be added to the identity assertion. That’s also done in the OAM console’s Policy Response tab, by using the “Asserted Attribute” response type and adding the user attribute name in the underlying LDAP server to $user.attr. The screenshot below shows the addition of the user “mail” attribute.

Customizing the Identity Assertion

Customizing the Identity Assertion

That makes up for the following Attribute element in the assertion:

<saml:Attribute NameFormat="urn:oasis:names:tc:SAML:2.0:attrname-format:basic" Name="oracle:idm:claims:ids:attributes">
    <saml:AttributeValue xsi:type="xs:string">email=sadmin@oracle.com</saml:AttributeValue>
</saml:Attribute>

For an overall discussion on Policy Responses, look at Introduction to Policy Responses for SSO.

 

Securing the Application

Now that we have good grasp on the identity assertion feature, integrating it into our use case scenario becomes simpler.

A JavaScript-based client web application is typically hosted by an HTML page. Executed by the web browser, it makes AJAX-like calls to REST services. We want to protect the services and the HTML page itself with OAM. The user typically authenticates to OAM when requesting the HTML page. The AJAX-like calls also go through the Webgate: an authorization policy is triggered and the identity assertion is issued back to the Webgate, which adds it to the outgoing request to the service endpoint. From OAM’s perspective, the point to consider here is that a REST service is nothing different from a traditional web-based resource typically secured by OAM. The diagram below illustrates the REST service invocation.

 

Use Case Implementation with OAM

Use Case Implementation with OAM

From an OAM policy standpoint, there are three resources to protect: the HTML page, the JavaScript resource, and the services. Since we’re only interested in generating the assertion when invoking the service, I’d have two separate authorization policies: one for the HTML and the JavaScript, with no assertion generation; and another one for the services, this time marking the “Identity Assertion” checkbox.

Let me reinforce the importance of protecting the hosting HTML page in the first place. That’s an absolute must, or an unauthenticated request to the REST service will disrupt the user experience, because the redirects at login time will bypass what you intended to provide with your nice and slick JavaScript code. Another aspect worth mentioning regards OAM session timeouts when a REST call is made. The JavaScript must be able to handle such an event, possibly by sending a request to the hosting HTML page, so that the whole process starts over and the user experience is preserved.
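One possible way to handle this in the AngularJS application shown later is an $http interceptor that restarts the flow when a call fails or comes back with the OAM login page instead of JSON. This is only a hedged sketch: the content-type heuristic and the status codes checked below are assumptions about how the timeout manifests, not something OAM mandates.

// hedged sketch: reload the hosting page when a REST call looks like an OAM login redirect
partsInvApp.config(['$httpProvider', function($httpProvider) {
  $httpProvider.interceptors.push(['$q', function($q) {
    return {
      response: function(response) {
        // assumption: an expired OAM session returns the login page (HTML) instead of JSON
        var contentType = response.headers('Content-Type') || '';
        if (contentType.indexOf('text/html') !== -1) {
          window.location.reload(); // restart the flow through the protected HTML page
        }
        return response;
      },
      responseError: function(rejection) {
        if (rejection.status === 401 || rejection.status === 0) {
          window.location.reload();
        }
        return $q.reject(rejection);
      }
    };
  }]);
}]);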

 

Sample Code and Configuration

I’ve done a small implementation myself. I’ve deployed my REST service on JBoss EAP and used OHS as my HTTP server. If you’ve followed along, understanding the artifacts in this section is straightforward. Please notice that JBoss has no pre-built agent for validating the identity assertion. As you can see by looking at the REST Service Implementation below, it simply prints the OAM_IDENTITY_ASSERTION request header, just to confirm it indeed comes in. Writing an agent and integrating it with a JBoss Login Module is the custom work necessary for this specific scenario. It is important to mention that this is not the case with WebLogic Server in Oracle Fusion Middleware, where you have OWSM to the rescue.

HTML

<html ng-app="partsInvApp">
  <head>
    <meta charset="utf-8">
    <title>Parts Inventory Application</title>

    <link href="bootstrap/css/bootstrap.min.css" rel="stylesheet">

    <script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.4.4/angular.min.js"></script>
    <script src="https://ajax.googleapis.com/ajax/libs/angularjs/1.4.4/angular-cookies.js"></script>
    <script src="./partsInvApp.js"></script>
  </head>
  <body>
<!--
    <p ng-controller="userProfileController">  Welcome <b>{{firstName}} {{lastName}}</b>, check out our inventory list</p>
-->
    <div style="width:600px" width="600" class="table-responsive" ng-controller="partsInvController">
      <table class="table table-striped" style="width:600px" width="600">
        <thead>
          <tr>
            <th width="15%">Id</th>
            <th width="15%">Name</th>
            <th width="40%">Description</th>
            <th width="15%">Price</th>
            <th width="15%"> </th>
          </tr>
        </thead>  
        <tbody>
          <form name="orderForm">
          <tr ng-repeat="part in parts">
          	<td width="15%">{{part.uniqueid}}</td>
            <td width="15%">{{part.name}}</td>
            <td width="40%">{{part.desc}}</td>
            <td width="15%">{{part.price}}</td>
            <td width="15%" valign="top">
             <!-- <form name="orderForm"> -->
                    <input type="hidden" name="partid" value="{{part.uniqueid}}" ng-model="part.uniqueid">
                    <button type="submit" class="btn btn-sm btn-primary"
                      ng-click="orderPart(part)"
                      ng-disabled="orderForm.$invalid">Order</button>
             <!-- </form> -->
            </td>  
          </tr>
        </form>
        </tbody>  
      </table>
      <h4 align="center" ng-if="submitResult">
        <span class="label label-success">
          {{submitResult}}
        </span>  
      </h4>  
    </div>  
  </body>
</html>

AngularJS

var partsInvApp = angular.module('partsInvApp', []).config(['$httpProvider', function($httpProvider) {
    $httpProvider.defaults.withCredentials = true;
  }]);

partsInvApp.controller('partsInvController', function ($scope, $http){
      $http.get('http://den00hdo.us.oracle.com:7777/services/parts').success(function(data) {
    	//$http.get('services/partsinventory/parts').success(function(data) {
    	$scope.parts = data.result;
    });

    $scope.orderPart = function(part) {

		var config = {
        	params: {
          		part: part
        	}
      	};
      	console.log(config.params.part.uniqueid);
      	$scope.submitResult = "Order successfully placed for part id " + config.params.part.uniqueid;
    };   	
});

http://den00hdo.us.oracle.com:7777/services/parts points to the Oracle HTTP Server, which in turn forwards the request to the backend service. See below.

 

Oracle HTTP Server mod_proxy Rule

This is within OHS httpd.conf.

NameVirtualHost *:7777
<VirtualHost *:7777>

#    Routing to REST service
     ProxyPass /services/parts http://den00hdo.us.oracle.com:8080/services/rest/partsinventory/parts
     ProxyPassReverse /services/parts http://den00hdo.us.oracle.com:8080/services/rest/partsinventory/parts

</VirtualHost>

/services/parts is the resource for which an Authorization Policy with the "Identity Assertion" checkbox marked must be created in the OAM Console.

 

REST Service Implementation

package oracle.ateam.sample.partsinvapi;

import javax.ws.rs.CookieParam;
import javax.ws.rs.GET;
import javax.ws.rs.HeaderParam;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;

@Path("/partsinventory")
public class PartsInventory {
    public PartsInventory() {
        super();
    }

    @GET
    @Produces("application/json")
    @Path("parts")
    public String getParts(@HeaderParam("OAM_IDENTITY_ASSERTION") String identityAssertion) {
        
        System.out.println("** DEBUG: PartsInventory web service: Request header OAM_IDENTITY_ASSERTION: " + identityAssertion);
        
        try {
            String result = "{\"result\":"; 
            result += "[";
            result += "{\"uniqueid\" : \"123\", \"name\" : \"ABC\", \"desc\" : \"This is part ABC\", \"price\" : \"100.00\"},";
            result += "{\"uniqueid\" : \"456\", \"name\" : \"DEF\", \"desc\" : \"This is part DEF\", \"price\" : \"200.00\"},";
            result += "{\"uniqueid\" : \"789\", \"name\" : \"GHI\", \"desc\" : \"This is part GHI\", \"price\" : \"300.00\"}";
            
            result += "]";
            result += "}";
            return result;
        }
        catch (Exception e) {
            e.printStackTrace();
            return "{\"error\":\"" + e.getMessage() + "\"}";
        }
    }
}

Conclusion

In this post, I've demonstrated how to use a sometimes unnoticed yet powerful feature of OAM for implementing identity propagation between a JavaScript-based client web application and REST services, a very common architectural pattern nowadays. So, if you're already using OAM to secure your traditional server-side web applications (Java EE, Perl, PHP, etc.) and want to embrace this new application model, you have a tool right at your fingertips, and one that requires minimal implementation effort.

Extracting Data from Oracle Business Intelligence 12c Using the BI Publisher REST API


Introduction

This post details a method of extracting data from an Oracle Business Intelligence Enterprise Edition (OBIEE) environment that is integrated with Oracle Business Intelligence Publisher (BIP) 12c. The environment may either be Cloud-Based or On-Premise.

The method utilizes the BI Publisher REST API to extract data from a BIP report. It also uses BIP parameters to filter the result set.

It builds upon the A-Team post Using Oracle BI Publisher to Extract Data From Oracle Sales and ERP Clouds. That post uses SOAP web services to accomplish the same result.

Note: The BI Publisher REST API is a new feature in the 12c version and functions only when accessing a BIP 12c environment.

The steps below depict how to build the extract.

Utilize an existing BI Analysis and BI Publisher Report

This post uses the analysis, filter and BI Publisher report from the post Using Oracle BI Publisher to Extract Data From Oracle Sales and ERP Clouds. Note: This post uses a filter named Analysis rather than the one named level4.

Create a REST Request to Retrieve the BIP Report Definition

This step sends a REST request to retrieve information necessary to actually call the report. Specifically, it retrieves the parameter information needed when running the report.

This post uses the Postman API testing utility as noted in the References section at the end of this post.

1. Create a new Collection

The collection is created by clicking the icon shown below:

BIP_POSTMAN_COLLECTION

Note: Enter a name and save the collection.

2. Add URL

Add the URL with this format: http(s)://hostname/xmlpserver/services/rest/v1/reports/path/reportname

For example: http(s)://hostname/xmlpserver/services/rest/v1/reports/custom%2FBIP_DEMO_REPORT

Notes:

The catalog location of the first report is Shared Folders/Custom/BIP_DEMO_REPORT. The top-level shared folder in the catalog, Shared Folders, is assumed. The starting point is the folder directly below it, i.e. Custom.

The URL must be URL encoded (percent-encoded) to be sent over the internet. Any space character in the path is replaced with %20 and any slash character, i.e. /, is replaced with %2F.

The URL for this report is shown below:

BIP_POSTMAN_URL

3. Add Authorization Header

Click on Headers as shown in the figure above.

Enter a key of Authorization.

For the value use a Base64 encoded username and password prefixed with “Basic “. To obtain the encoding, this post uses the website at https://www.base64encode.org/

The username and password are shown below separated by a colon character. The encoded result is shown at the bottom.

Base64 Encode Username Password
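
Alternatively, the same value can be produced from a command line on most Linux or macOS systems with the base64 utility; the credentials below are placeholders, not real ones. The -n flag keeps the trailing newline out of the encoded result.

echo -n 'username:password' | base64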

The header looks like this. Note: the encoded username and password below is derived from valid credentials.

BIP_POSTMAN_AUTHORIZATION

4. Get Report Definition

Set the HTTP method to GET and click Send. The response is returned in JSON format as shown below:

Note that the parameter name for this report is the prompt label, Analysis, prefixed with the text saw.param.

BIP_POSTMAN_reptDef

Create a REST Request to Run the BIP Report

This creates the request to extract the data.

1. Add an Additional Header

For the additional header, enter a key of Content-Type.

Enter a value of multipart/form-data; boundary="Boundary_1_1153447573_1465550731355". Note: The boundary value entered here in the header is for use in the body below. The boundary text may be any random text not used elsewhere in the request.

Change the HTTP method in the upper left to POST.

The two headers are shown below:

BIP_POSTMAN_RUN1

2. Create the Body

The Body tab is opened as shown in the figure above.

The structure of the body is shown below. Note: The boundary text specified in the header forms the first and last lines of the structure. All boundary lines must be prefixed with the "--" string. Additionally, the closing boundary line must also be suffixed with the "--" string.

BIP_POSTMAN_RUNBODY

The Content-Type: application/json line specifies the request format.

The Content-Disposition: form-data; name="ReportRequest" line specifies that the text following the blank line contains the non-default items and values to be used for the run.

The JSON request text specifies the cache is bypassed and the value “Audio” is passed to the prompt / parameter to filter the results.

3. Send the Request and Review Results

The results are shown below:

BIP_POSTMAN_RESULTS

The result section is separated by system-generated boundary lines.

The XML output is shown above the closing boundary line.

Usage of the REST Request

The REST API request to run a BIP report may now be used anywhere a REST API request can be issued.

An example of the REST API request used in a curl statement is shown below. Curl is a command-line tool for transferring data to or from a server using URL syntax.

BIP_POSTMAN_CURL
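
For reference, a curl command along the following lines could issue the same request. This is a sketch with assumptions: the hostname and credentials are placeholders, request_body.txt is assumed to hold the multipart body built in the previous step, and the boundary named in the Content-Type header must match the one used inside that file.

curl -X POST "http://hostname/xmlpserver/services/rest/v1/reports/custom%2FBIP_DEMO_REPORT/run" \
  -H "Authorization: Basic <base64-encoded username:password>" \
  -H 'Content-Type: multipart/form-data; boundary="Boundary_1_1153447573_1465550731355"' \
  --data-binary @request_body.txt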

Summary

This post details a simple method of extracting data from an OBIEE environment using BI Publisher 12c and the BI Publisher REST API.

For more BICS and BI best practices, tips, tricks, and guidance that the A-Team members gain from real-world experiences working with customers and partners, visit Oracle A-Team Chronicles for BICS.

References

API Testing POSTMAN Download

API Testing using POSTMAN

REST API for Oracle BI Publisher

Get Started with Analyses and Dashboards

Report Designer’s Guide for Oracle Business Intelligence Publisher

Loading Data into Oracle BI Cloud Service using BI Publisher Reports and REST Web Services


Introduction

This post details a method of loading data that has been extracted from Oracle Business Intelligence Publisher (BIP) into the Oracle Business Intelligence Cloud Service (BICS). The BIP instance may either be Cloud-Based or On-Premise.

It builds upon the A-Team post Extracting Data from Oracle Business Intelligence 12c Using the BI Publisher REST API. This post uses REST web services to extract data from an XML-formatted BIP report.

The method uses the PL/SQL language to wrap the REST extract, XML parsing commands, and database table operations. It produces a BICS staging table which can then be transformed into star-schema object(s) for use in modeling.  The transformation processes and modeling are not discussed in this post.

Additional detailed information, including the complete text of the procedure described, is included in the References section at the end of the post.

Rationale for using PL/SQL

PL/SQL is the only procedural tool that runs on the BICS / Database Schema Service platform. Other wrapping methods e.g. Java, ETL tools, etc. require a platform outside of BICS to run on.

PL/SQL can utilize native SQL commands to operate on the BICS tables. Other methods require the use of the BICS REST API.

Note: PL/SQL is very good at showcasing functionality. However, it tends to become prohibitively resource-intensive when deployed in an enterprise production environment.

For the best enterprise deployment, an ETL tool such as Oracle Data Integrator (ODI) should be used to meet these requirements and more:

* Security

* Logging and Error Handling

* Parallel Processing – Performance

* Scheduling

* Code Re-usability and Maintenance

The steps below depict how to load a BICS table.

About the BIP Report

The report used in this post is named BIP_DEMO_REPORT and is stored in a folder named Shared Folders/custom as shown below:

BIP Report Location

The report is based on a simple analysis with three columns and output as shown below:

BIP Demo Analysis

Note: The method used here requires all column values in the BIP report to be NOT NULL for two reasons:

* The XPATH parsing command signals either the end of a row or the end of the data when a null result is returned.

* All columns being NOT NULL ensures that the result set is dense and not sparse. A dense result set ensures that each column is represented in each row.

Additional information regarding dense and sparse result sets may be found in the Oracle document Database PL/SQL Language Reference.

One way to ensure a column is not null is to use the IFNull function in the analysis column definition as shown below:

BIP IFNULL Column Def
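
For example, a column formula along these lines could be used; the column name here is purely illustrative and not taken from the original analysis:

IFNULL("Products"."Product Type", 'Unspecified')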

Call the BIP Report

The REST API request used here is similar to the one detailed in Extracting Data from Oracle Business Intelligence 12c Using the BI Publisher REST API. The REST API request should be constructed and tested using a REST API testing tool, e.g. Postman.

This step uses the APEX_WEB_SERVICE package to issue the REST API request and return the result in a CLOB variable. The key inputs to the package call are:

* The URL for the report request service

* Two request headers to be sent for authorization and content.

* The REST body the report request service expects.

* An optional proxy override

An example URL is below:

http://hostname/xmlpserver/services/rest/v1/reports/custom%2FBIP_DEMO_REPORT/run

Note: Any ASCII special character used in a value within a URL, as opposed to the syntax, needs to be referenced using its ASCII code prefixed by a % sign. In the example above, the slash (/) character is legal in the syntax but not in the value of the report location. Thus the report location custom/BIP_DEMO_REPORT must be written as custom%2FBIP_DEMO_REPORT, where 2F is the ASCII code for a slash character.

An example request Authorization header is below.

apex_web_service.g_request_headers(1).name  := 'Authorization';
apex_web_service.g_request_headers(1).value := 'Basic cHJvZG5leTpBZG1pbjEyMw==';

Note: The authorization header value is the string ‘Basic ‘ concatenated with a Base64 encoded representation of a username and password separated by a colon e.g.  username:password

Encoding of the Base64 result should first be tested with a Base64 encoding tool e.g. base64encode.org

An example of the Content-Type header is below:

apex_web_service.g_request_headers(2).name  := 'Content-Type';
apex_web_service.g_request_headers(2).value := 'multipart/form-data; boundary="Boundary_1_1153447573_1465550731355"';

Note: The boundary value entered here in the header is for usage in the body below. The boundary text may be any random text not used elsewhere in the request.

An example of a report request body is below:

--Boundary_1_1153447573_1465550731355
Content-Type: application/json
Content-Disposition: form-data; name="ReportRequest"

{"byPassCache":true,"flattenXML":false}
--Boundary_1_1153447573_1465550731355--

An example proxy override is below:

www-proxy.us.oracle.com

 An example REST API call:

f_report_clob := apex_web_service.make_rest_request (
    p_url            => p_report_url,
    p_body           => l_body,
    p_http_method    => 'POST',
    p_proxy_override => l_proxy_override );
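
Putting the fragments above together, a minimal, assumption-based sketch of the whole call could look like the block below. The URL, credentials, and proxy values are the placeholders used earlier in this post, and the multipart body is assembled with CRLF line endings.

DECLARE
  l_body           CLOB;
  f_report_clob    CLOB;
  l_crlf           VARCHAR2(2)   := chr(13) || chr(10);
  p_report_url     VARCHAR2(400) := 'http://hostname/xmlpserver/services/rest/v1/reports/custom%2FBIP_DEMO_REPORT/run';
  l_proxy_override VARCHAR2(100) := 'www-proxy.us.oracle.com';
BEGIN
  -- Authorization and Content-Type request headers
  apex_web_service.g_request_headers(1).name  := 'Authorization';
  apex_web_service.g_request_headers(1).value := 'Basic cHJvZG5leTpBZG1pbjEyMw==';
  apex_web_service.g_request_headers(2).name  := 'Content-Type';
  apex_web_service.g_request_headers(2).value := 'multipart/form-data; boundary="Boundary_1_1153447573_1465550731355"';

  -- Multipart body wrapping the JSON report request
  l_body := '--Boundary_1_1153447573_1465550731355' || l_crlf ||
            'Content-Type: application/json' || l_crlf ||
            'Content-Disposition: form-data; name="ReportRequest"' || l_crlf || l_crlf ||
            '{"byPassCache":true,"flattenXML":false}' || l_crlf ||
            '--Boundary_1_1153447573_1465550731355--';

  -- Issue the POST and capture the raw multipart response
  f_report_clob := apex_web_service.make_rest_request (
      p_url            => p_report_url,
      p_body           => l_body,
      p_http_method    => 'POST',
      p_proxy_override => l_proxy_override );
END;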

Parse the BIP REST Result

The BIP REST result is the report XML data embedded in text with form-data boundaries.

This step uses the:

* INSTR function to determine the beginning and end of the embedded XML

* SUBSTR function to extract just the embedded XML and store it in a CLOB variable

* XMLTYPE.createXML function to convert and return the XML.

The key inputs to this step are:

* The CLOB returned from BIP REST call above

* The XML root name returned from the BIP report, e.g. DATA_DS

An example of the REST result returned is below:

--Boundary_2_1430729833_1479236681852
Content-Type: application/json
Content-Disposition: form-data; name="ReportResponse"

{"reportContentType":"text/xml"}

--Boundary_2_1430729833_1479236681852
Content-Type: application/octet-stream
Content-Disposition: form-data; filename="xmlp2414756005405263619tmp"; modification-date="Tue, 15 Nov 2016 19:04:41 GMT"; size=1242; name="ReportOutput"

<?xml version="1.0" encoding="UTF-8"?>
<!--Generated by Oracle BI Publisher 12.2.1.1.0 -Dataengine, datamodel:_custom_BIP_DEMO_MODEL_xdm -->
<DATA_DS><SAW.PARAM.ANALYSIS></SAW.PARAM.ANALYSIS>
<G_1>
<COLUMN0>Accessories</COLUMN0><COLUMN1>5161697.87</COLUMN1><COLUMN2>483715</COLUMN2>
</G_1>
<G_1>
<COLUMN0>Smart Phones</COLUMN0><COLUMN1>6773120.36</COLUMN1><COLUMN2>633211</COLUMN2>
</G_1>
</DATA_DS>
--Boundary_2_1430729833_1479236681852--

Examples of the string functions to retrieve and convert just the XML are below. The f_report_clob variable contains the result of the REST call. The p_root_name variable contains the BIP report specific XML rootName.

To find the starting position of the XML, the INSTR function searches for the opening tag consisting of the root name prefixed with a ‘<’ character, e.g. <DATA_DS:

f_start_position := instr( f_report_clob, '<' || p_root_name );

To find the length of the XML, the INSTR function searches for the position of the closing tag consisting of the root name prefixed with the '</' characters, e.g. </DATA_DS, adds the length of the closing tag (using the LENGTH function), and subtracts the starting position:

f_xml_length := instr( f_report_clob, '</' || p_root_name ) + length( '</' || p_root_name || '>' ) - f_start_position;

To extract the XML and store it in a CLOB variable, the SUBSTR function uses the starting position and the length of the XML:

f_xml_clob := substr(f_report_clob, f_start_position, f_xml_length );

To convert the CLOB into an XMLTYPE variable:

f_xml := XMLTYPE.createXML( f_xml_clob );

Create a BICS Table

This step uses a SQL command to create a simple staging table that has 20 identical varchar2 columns. These columns may be transformed into number and date data types in a future transformation exercise that is not covered in this post.

A When Others exception block allows the procedure to proceed if an error occurs because the table already exists.

A shortened example of the create table statement is below:

execute immediate 'create table staging_table ( c01 varchar2(2048), … , c20 varchar2(2048) )';
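
As an assumption-based sketch of that guarded create (not the exact text of the referenced procedure; the 20 columns are generated in a loop just to keep the example short):

DECLARE
  l_sql VARCHAR2(4000) := 'create table staging_table ( ';
BEGIN
  -- Build the c01 .. c20 varchar2(2048) column list dynamically
  FOR i IN 1 .. 20 LOOP
    l_sql := l_sql || 'c' || lpad(i, 2, '0') || ' varchar2(2048)' ||
             CASE WHEN i < 20 THEN ', ' ELSE ' )' END;
  END LOOP;
  EXECUTE IMMEDIATE l_sql;
EXCEPTION
  WHEN OTHERS THEN
    NULL;  -- most likely the table already exists (ORA-00955), so simply proceed
END;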

Load the BICS Table

This step uses SQL commands to truncate the staging table and insert rows from the BIP report XML content.

The XML content is parsed using an XPATH command inside two LOOP commands.

The first loop processes the rows by incrementing a subscript.  It exits when the first column of a new row returns a null value.  The second loop processes the columns within a row by incrementing a subscript. It exits when a column within the row returns a null value.

The following XPATH examples are for a data set that contains 11 rows and 3 columns per row:

//G_1[2]/*[1]/text()     -- Returns the value of the first column of the second row

//G_1[2]/*[4]/text()     -- Returns a null value for the 4th column, signaling the end of the row

//G_1[12]/*[1]/text()    -- Returns a null value for the first column of a new row, signaling the end of the data set

After each row is parsed, it is inserted into the BICS staging table.
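
Here is a minimal sketch of those two loops and the per-row insert. It assumes the staging_table from the previous step exists; f_xml would normally be the XMLTYPE produced by XMLTYPE.createXML above, but a small two-row literal taken from the sample output earlier is used so the block can run on its own.

DECLARE
  TYPE t_cols IS TABLE OF VARCHAR2(2048) INDEX BY PLS_INTEGER;
  f_cols  t_cols;
  f_row   PLS_INTEGER := 0;
  f_col   PLS_INTEGER;
  f_xpath VARCHAR2(256);
  -- In the real procedure f_xml comes from XMLTYPE.createXML( f_xml_clob )
  f_xml   XMLTYPE := XMLTYPE(
    '<DATA_DS><G_1><COLUMN0>Accessories</COLUMN0><COLUMN1>5161697.87</COLUMN1><COLUMN2>483715</COLUMN2></G_1>' ||
    '<G_1><COLUMN0>Smart Phones</COLUMN0><COLUMN1>6773120.36</COLUMN1><COLUMN2>633211</COLUMN2></G_1></DATA_DS>');
BEGIN
  LOOP
    f_row := f_row + 1;
    -- A null first column signals the end of the data set
    EXIT WHEN f_xml.existsNode('//G_1[' || f_row || ']/*[1]/text()') = 0;
    f_col := 0;
    LOOP
      f_col := f_col + 1;
      f_xpath := '//G_1[' || f_row || ']/*[' || f_col || ']/text()';
      -- A null column signals the end of the current row
      EXIT WHEN f_xml.existsNode(f_xpath) = 0;
      f_cols(f_col) := f_xml.extract(f_xpath).getStringVal();
    END LOOP;
    -- Insert the parsed row; the demo report has three columns
    INSERT INTO staging_table (c01, c02, c03)
    VALUES (f_cols(1), f_cols(2), f_cols(3));
  END LOOP;
  COMMIT;
END;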

An image of the staging table result is shown below:

BIP Table Output

Summary

This post detailed a method of loading data that has been extracted from Oracle Business Intelligence Publisher (BIP) into the Oracle Business Intelligence Cloud Service (BICS).

Data was extracted and parsed from an XML-formatted BIP report using REST web services wrapped in the Oracle PL/SQL APEX_WEB_SERVICE package.

A BICS staging table was created and populated. This table can then be transformed into star-schema objects for use in modeling.

For more BICS and BI best practices, tips, tricks, and guidance that the A-Team members gain from real-world experiences working with customers and partners, visit Oracle A-Team Chronicles for BICS.

References

Complete Text of Procedure Described

Extracting Data from Oracle Business Intelligence 12c Using the BI Publisher REST API

Database PL/SQL Language Reference

Reference Guide for the APEX_WEB_SERVICE

REST API Testing Tool

XPATH Testing Tool

Base64 decoding and encoding Testing Tool

 

 
