Channel: iPaaS@ORACLE.CLOUD

#660 - OIC --> using the OPA adapter

OPA = Oracle Policy Automation



Oracle Policy Automation is an end-to-end solution for capturing, managing, and deploying complex legislation and other document-based policies across channels and processes.

One creates policy models in OPA - these contain the business logic. From the docs -
Oracle Policy Modeling is a desktop application used to build interactive interviews based on business requirements, complex policy and legislation. It can be used for modeling:
  • Policy eligibility
    • For example, determining whether a contact is eligible for a benefit, permit, loan, discount, license, product upgrade or parental leave.
  • Calculations
    • For example, determining a contact’s rate of benefit, allowance, deduction, tax, discount, waiting period or follow-up.

Essentially we will pass in a payload to OPA and it will process it via the policy model I select, then return a result.

As an OPA neophyte, I have worked out that it has two main parts -

1. the Policy Modeling Tool - which you download to a Windows machine.
This is where you design the policies.

2. OPA Hub - for policy model management and deployment.









A colleague has kindly created a simple policy model for me -



As you can see, he has checked Web Services - Assess - so what is that?

From the docs -

The generic assess service uses supplied data to determine one or more outcomes, 
can work out what additional data is needed to reach a conclusion, 
and provides reports on how decisions were reached.

OIC_test will simply assess an incoming purchase order and return an approval value -
true or false.

Creating the OIC Integration with OPA

I create the following integration in OIC -





































The Test step is the Invoke of OPA.

The OPA Connection is defined as follows -

































The Access Token URI = OPA Hub URL/opa-hub/api/auth
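The Access Token URI is just the OPA Hub base URL with a fixed path appended. A minimal sketch of that concatenation - the hub URL below is a placeholder, not a real instance:

```python
def access_token_uri(opa_hub_url: str) -> str:
    """Build the OAuth access-token endpoint from the OPA Hub base URL."""
    return opa_hub_url.rstrip("/") + "/opa-hub/api/auth"

# Hypothetical hub URL, for illustration only
print(access_token_uri("https://myopahub.example.com/"))
```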

Note the OAuth requirements - for this, some preparatory steps in OPA are required -























Full setup info -

OPA Adapter Guide here
Supplementary OPA doc for Integration here
OPA REST API here


Now to the mapping -












Let's look at it in detail -



















OPA processes cases, think of a social welfare entitlement case and the complex rules
that need to be applied to it.
As already explained, my simple example processes a purchase order case.
Note the outcomes field - This structure contains the case field(s) which will hold the response from OPA. In my simple example, this is the order_approved field. Again, as already mentioned, OPA will return a value of true or false.
The orderId, order_date and order_value fields are all related to the purchase order I will be validating.

The final field, id, is the case unique id field and needs to be set. I just set it to the orderId.
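Putting the request mapping together, the assess payload is essentially a case structure. Here is a hedged sketch using the field names from my simple model - the exact JSON shape OPA expects may differ, so treat this as an illustration of the mapping, not the wire format:

```python
import json

def build_case(order_id, order_date, order_value):
    # id is the mandatory case unique id - I simply reuse the orderId
    return {
        "id": order_id,
        "orderId": order_id,
        "order_date": order_date,
        "order_value": order_value,
        # outcomes names the field(s) OPA will fill in on the response
        "outcomes": ["order_approved"],
    }

case = build_case("PO-1001", "2018-12-01", 4999.0)
print(json.dumps(case))
```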

Now to the response mapping -
















As you can see, I map the order_approved field to my result field.

Now to testing, via Postman -




















Very succinct!


#661 First steps with Oracle Cloud Infrastructure (OCI) - Users, Roles, Policies

$
0
0















When you subscribe to OCI you get a default administrator account.
So your first task may be to create other users and assign permissions to them.

Users can be natural persons or applications.

But first to some OCI concepts -

1. Regions - OCI is hosted in regions - different physical locations in the world e.g. Phoenix US, London UK, Frankfurt DE.

2. Availability Domains (AD) - Within a Region I have Availability Domains, which are isolated from each other, thus giving me High Availability OOTB. Some OCI resources you create, e.g. storage volumes, are AD specific.

3. Tenancy - essentially your account - your slice of OCI.

4. Compartments - containers you can define within your tenancy so you can organise and isolate the resources you create. For example, a large organization could assign different compartments to departments etc. Compartments are logical as opposed to the physical Regions and ADs. Resources can be shared across compartments.


Users, Roles and Policies

So let's try this out...











































Now I create a Compartment -


















Next comes a Policy - it will give my group permissions within the compartment
















The policy I create is as follows -
Allow group niallcOCI-usersGroup to manage all-resources in compartment niallCCompartment
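OCI policy statements all follow the same Allow grammar, so they are easy to generate. A small helper that renders statements like the one above - the group and compartment names are mine, and real policies support more verbs (inspect, read, use, manage) than shown here:

```python
def policy_statement(group: str, verb: str, resource: str, compartment: str) -> str:
    """Render an OCI IAM policy statement:
    Allow group <group> to <verb> <resource> in compartment <compartment>"""
    return f"Allow group {group} to {verb} {resource} in compartment {compartment}"

print(policy_statement("niallcOCI-usersGroup", "manage", "all-resources", "niallCCompartment"))
```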






































The Policy is created -













Now to create a User -
























I create a temp password for the user -













I now login as the new user -




















I am prompted to change my password -



















I go to Compute - Instances













I cannot select the Compartment - as I have not been granted that role



















I logout and log back in as the admin.
I edit the user -































I log out and then back in as the new user -
I again go to Compute - Instances

Now I can select the Compartment







































#662 OCI - Virtual Cloud Networks (VCNs) and Compute Instance Creation



























According to the ORCL docs -

A Virtual Cloud Network is a virtual version of a traditional network—including subnets,
route tables, and gateways—on which your instances run. A cloud network resides
within a single region but can cross multiple Availability Domains. A VCN covers a single, contiguous IPv4 CIDR block of your choice.

Some more concepts/definitions -

1. Subnets - Subdivisions you define in a VCN (for example, 10.0.0.0/24 and 10.0.1.0/24). Subnets contain virtual network interface cards (VNICs), which attach to instances. Each subnet exists in a single availability domain and consists of a contiguous range of IP addresses that do not overlap with other subnets in the VCN.

2. VNICs - A virtual network interface card (VNIC), which attaches to an instance and resides in a subnet to enable a connection to the subnet's VCN. The VNIC determines how the instance connects with endpoints inside and outside the VCN. Each instance has a primary VNIC that's created during instance launch and cannot be removed. 

3. Private IP - A private IP address and related information for addressing an instance (for example, a hostname for DNS). Each VNIC has a primary private IP, and you can add and remove secondary private IPs. The primary private IP address on an instance doesn't change during the instance's lifetime and cannot be removed from the instance.

4. Public IP - A public IP address and related information. You can optionally assign a public IP to your instances or other resources that have a private IP. Public IPs can be either ephemeral or reserved.

5. Internet Gateway - An optional virtual router that you can add to your VCN. It provides a path for network traffic between your VCN and the internet. 

6. Routing Tables - Virtual route tables for your VCN. Your VCN comes with a default route table, and you can add more. These route tables provide mapping for the traffic from subnets via gateways or specially configured instances to destinations outside the VCN.

7. Security Lists - Virtual firewall rules for your VCN. Your VCN comes with a default security list, and you can add more. These security lists provide ingress and egress rules that specify the types of traffic allowed in and out of the instances. 


Check out the full docs here
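The subnet rules above - each subnet inside the VCN's contiguous CIDR block, no overlap between subnets - can be checked with Python's ipaddress module. The CIDRs below are the example values from the definitions above:

```python
import ipaddress

vcn = ipaddress.ip_network("10.0.0.0/16")
subnets = [ipaddress.ip_network("10.0.0.0/24"), ipaddress.ip_network("10.0.1.0/24")]

# Every subnet must fall inside the VCN's contiguous CIDR block...
assert all(s.subnet_of(vcn) for s in subnets)
# ...and subnets must not overlap each other
assert not subnets[0].overlaps(subnets[1])
print("subnet layout is valid")
```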

So let's go and create a VCN -















































3 Subnets have also been created.






















One subnet per Availability Domain.

This can be seen, when I then go to create a Compute Instance -


























The relevant subnet is selected, based on AD.





















Other resources have also been created -




























Now back to the Compute Instance creation...
























I can then login -


























#663 OCI - Compute Service in some more detail


Create Compute Instance




























Please note: All text in italics is from the Oracle docs

Check out the shape types above  - Virtual Machine or Bare Metal

Bare Metal

A bare metal compute instance gives you dedicated physical server access for highest performance and strong isolation.

Essentially a single-tenant model - no noisy neighbours. You also get direct hardware access.
So who should go for this? For example, customers with performance intensive requirements.

Virtual Machine

A Virtual Machine (VM) is an independent computing environment that runs on top of physical bare metal hardware. The virtualization makes it possible to run multiple VMs that are isolated from each other. VMs are ideal for running applications that do not require the performance and resources (CPU, memory, network bandwidth, storage) of an entire physical machine.

Details of the available shapes for both options are here

Bare Metal includes GPU based shapes, optimal for high performance and machine learning.

The VM shapes support the following operating systems -


















BYOI (Bring Your Own Image) is also supported, e.g. for older versions of a particular O/S.

Here I am creating an Oracle Linux based VM -





















Boot Volumes

When you launch a virtual machine (VM) or bare metal instance based on an Oracle-provided image or custom image, a new boot volume for the instance is created in the same compartment. That boot volume is associated with that instance until you terminate the instance. When you terminate the instance, you can preserve the boot volume and its data.

All boot volumes are encrypted; they also offer faster boot times and enable compute instance scaling.
You can create a custom image based on a boot volume and then select that when creating a new instance. Backups/cloning is also supported.

Block Volumes


The Oracle Cloud Infrastructure Block Volume service lets you dynamically provision and manage  block storage volumes . You can create, attach, connect and move volumes as needed to meet your storage and application requirements. Once attached and connected to an instance, you can use a volume like a regular hard drive. Volumes can also be disconnected and attached to another instance without the loss of data.











































Regarding the Backup Policy -
gold - daily
silver - weekly
bronze - monthly

















Now I can attach it to my compute instance -








































iSCSI: A TCP/IP-based standard used for communication between a volume and attached instance.
Paravirtualized: A virtualized attachment available for VMs.







The iSCSI commands are available here -










































I ssh into the VM and execute the register command -






Note the positive response above.

I now execute the command to automatically re-connect after reboot -





I run sudo su - and log in to the iSCSI target -






fdisk -l
















Note the root volume sda and the new volume sdb.

I format it - mkfs.ext3 -















and then mount it - mount /dev/sdb /mnt/hotd













Edit /etc/fstab to mount automatically -

Firstly, I need the UUID for sdb - blkid







Format of entry in fstab is as follows -
UUID=699a776a-3d8d-4c88-8f46-209101f318b6 /mnt/vol1 xfs defaults,_netdev,nofail 0 2
In my case -











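A tiny helper to render an fstab entry in the format shown above - the UUID and mount point are placeholders:

```python
def fstab_entry(uuid: str, mount_point: str, fstype: str = "ext3") -> str:
    # _netdev delays mounting until the network (and the iSCSI session) is up;
    # nofail lets the instance boot even if the volume is unavailable
    return f"UUID={uuid} {mount_point} {fstype} defaults,_netdev,nofail 0 2"

print(fstab_entry("699a776a-3d8d-4c88-8f46-209101f318b6", "/mnt/vol1", "xfs"))
```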

File Systems

Oracle Cloud Infrastructure File Storage service provides a durable, scalable, distributed, enterprise-grade network file system. You can connect to a File Storage service file system from any bare metal, virtual machine, or container instance in your Virtual Cloud Network (VCN). You can also access a file system from outside the VCN using Oracle Cloud Infrastructure FastConnect and Internet Protocol security (IPSec) virtual private network (VPN).































































Now to mounting the file system I just created - again, all I need to do is to copy and paste the commands -























#664 OCI - Load Balancer




















The Oracle Cloud Infrastructure Load Balancing service provides automated traffic distribution from one entry point to multiple servers reachable from your virtual cloud network (VCN). The service offers a load balancer with your choice of a public or private IP address, and provisioned bandwidth.


Public Load Balancer 

- accepts traffic from the Internet.

A public load balancer is regional in scope and requires two subnets, each in a separate availability domain. One subnet hosts the primary load balancer and the other hosts a standby load balancer to ensure accessibility even during an availability domain outage. 

A public load balancer consumes two private IP addresses, one from each host subnet.

Private Load Balancer

To isolate your load balancer from the internet and simplify your security posture, you can create a private load balancer. The Load Balancing service assigns it a private IP address that serves as the entry point for incoming traffic.

So what do I need to try this out?

Two compute instances in different ADs.
These are my backend servers -


















I also need an Internet Gateway - one was created by default when I created my VCN.












I then install an Apache server on both Compute instances























I open the firewall for ports 80/443 on both instances -





Then reload the firewall -
firewall-cmd --reload
I start the http server on both machines -
systemctl start httpd
I then add a different index file for each - NiallC1/NiallC2
echo 'NiallC1'>/var/www/html/index.html

Your load balancer must reside in different subnets from your application instances. This configuration allows you to keep your application instances secured in subnets with stricter access rules, while allowing public internet traffic to the load balancer in the public subnets.

So let's create those 2 subnets -

4 tasks here -
1. Add a Security List

























Note: I have deleted the default ingress and egress rules.

2. Add a Routing Table - makes sense!

























3. Create SubNet1


I called it LB-Subnet1
























4. Same procedure for LB-Subnet2























Essentially - this is what I have created









Create a Load Balancer































































We are now here -



















A load balancer directs traffic to what's known as a Backend Set, e.g. my 2 Compute Instances.
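By default, the backend set distributes requests round-robin across its members. A toy simulation of that behavior - the server names mirror my two index files:

```python
import itertools

backends = ["NiallC1", "NiallC2"]
rr = itertools.cycle(backends)

# Four successive requests alternate between the two backends
responses = [next(rr) for _ in range(4)]
print(responses)
```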

I now create this backend set -







































































Now I add the 2 instances to the backend set -

For this I need the OCID of each -

























It's back to editing the Backend Set now -


































































We are now here -




















Now I need to create a Listener for the Load Balancer -

A listener is an entity that checks for connection requests. The load balancer listener listens for ingress client traffic using the port you specify within the listener and the load balancer's public IP.

I want to accept http requests on port 80.

































































































When you create a listener, you must also update your VCN's security list to allow traffic to that listener.













































Now to see if all is working -


THE BIG TEST


I open a browser and enter the public IP of my load balancer -









I refresh the page -












This blog post is based 100% on a great lab from the Oracle OCI team.
It is available here


Well done folks!



#665 OIC pre-builts on the Oracle Marketplace















Now there is a dedicated OIC page on the Oracle Marketplace.
To date, there are ca. 40 pre-built integrations available for download and use.

Just click here

#666 Managing OIC - Instance Management - Users & Roles
























I see the above when I log into my account at cloud.oracle.com
I click on Autonomous Integration













In the above, I can manage existing OIC instances and provision new ones.

Let's look at Management first -

OIC Instance Management
























The first menu option takes me to the OIC home page, where I can create integrations, processes etc.

Start/Stop -

Stop does the following -
-Integration endpoints are quiesced.
-Process instances are quiesced.
-Runtime is quiesced.
-Scheduled integrations do not execute.
-Database purging continues to run.
-REST APIs are available for use.
-Design time is available for use.

Start - restarts all of the quiesced items above.


Add Tags - tags can be used for categorising and searching your instances.

Scale Instance - from the docs -

You can increase or decrease the number of message packs for your instance based
on the demand. The maximum number of message packs for an instance is based on
your license type. For a bring your own license type (BYOL), the instance can have a
maximum of 3 message packs and each message pack adds 20K Messages per Hour
Pack to your instance. If you don’t have a BYOL license type, the instance can have a
maximum of 12 message packs and each message pack adds 5K Messages per Hour
Pack to your instance.

So what happens if you need to cater for more than 60k msgs per hour?
You simply add another OIC instance.
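The scaling limits quoted above work out to the same 60K per hour ceiling either way. Quick arithmetic:

```python
def max_msgs_per_hour(byol: bool) -> int:
    # BYOL: up to 3 packs x 20K msgs/hour; otherwise up to 12 packs x 5K msgs/hour
    packs, pack_size = (3, 20_000) if byol else (12, 5_000)
    return packs * pack_size

print(max_msgs_per_hour(True), max_msgs_per_hour(False))  # both 60000
```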











Remember, you are charged hourly, based on the number of 5K msg packs you specify.
So scaling here will directly affect your charges.

Change License Type - enables BYOL












Delete - scraps the OIC instance


OIC Provisioning 


Very simple -

























Note: I have selected the BYOL option above













If I select the other license type -














That's it!

Read a couple of pages of The Hare of the Dog, available from Amazon, and then your instance is ready for use.


Provisioning users and assigning OIC Roles



















So what do users, with these roles, see when they login to OIC?

Let's create one -











I define the user in IDCS (Identity Cloud Service) -



















I assign the roles to that user, back in the user management of OIC -

























ServiceUser Role


Let's assign the role ServiceUser to Nuada

















I now login to OIC as Nuada -















Note - I only see the Integrations IDE link.

But I have limited permissions within Integration.

I have View Only access to Integrations -












I cannot activate an integration -











I do, however, see all of the monitoring dashboards -





















ServiceDeployer Role

I now assign the above role to Nuada.






















I login to OIC and see -















I click on the Integrations link











So the ServiceDeployer has no permissions whatsoever.
It is not applicable to Integration.


ServiceMonitor Role


























Nuada can now only see the Monitoring page(s).




























ServiceDeveloper Role
























I login to OIC -

















and have full functionality.

#667 OIC --> Visual Builder Cloud Service --> Business Objects

VB CS, as you all probably know, enables one to create, publish and host mobile and web apps.

One uses Business Objects to define business data. These Business Objects and their data are then persisted in the underlying VB CS persistence store.
Access to the data - in my case below, Organizations - is via REST services.
The REST APIs are used by the VB CS apps you create; however, you can also leverage them from outside VB CS.

Recently, I had the opportunity to delve a bit deeper into Business Objects -

























Business Rules

I hope the basic idea of Business Objects is clear from my previous posts on VB CS.
Essentially, I can define a business object then auto-magically generate Create, Retrieve, Update and Delete forms for the aforementioned.







































Now to some of the extra value-add - The CountryCode field above is
defined as ReadOnly.





















So where did the above CountryCodes come from?






























Ok, the above is rather simplistic, but one can do much more complex processing -
























Endpoints




































Test in Postman -




















Now to the POST Endpoint -

























Role Based Security

Fine grained role based security is also available -























Import/Export Data

I can pre-populate my business objects -









































I can also export in .csv format -











#668 OIC --> VB CS --> Service Connections. Triggering an Integration from VB CS

From the docs -

When you want to expose business objects from an external source in your visual application, you can add and manage connections to sources in the Service Connections pane of the Artifact Browser.

Ok, so this allows us to bring external functionality into VB CS.

The previous post showed my Organization Business Object.
I also have an Integration that creates Organizations in Service Cloud.
This Integration is exposed via REST -

https://myOIC/ic/api/integration/v1/flows/rest/AA_CREATESERVICEORG/1.0/createOrganization
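Calling such an endpoint from outside OIC needs basic auth and a JSON body. A hedged sketch that just assembles the request pieces - the URL is the one above, but the payload field name (organizationName) is illustrative, not taken from the actual integration contract:

```python
import base64
import json

url = "https://myOIC/ic/api/integration/v1/flows/rest/AA_CREATESERVICEORG/1.0/createOrganization"

def build_request(user: str, pwd: str, org_name: str):
    # Basic auth header: base64("user:password")
    token = base64.b64encode(f"{user}:{pwd}".encode()).decode()
    headers = {"Authorization": f"Basic {token}", "Content-Type": "application/json"}
    body = json.dumps({"organizationName": org_name})  # illustrative payload shape
    return headers, body

headers, body = build_request("oicUser", "oicPwd", "The Hare of the Dog Public House")
print(headers["Content-Type"], body)
```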





























































Add Authentication -





















add the body -















































As you can see, The Hare of the Dog Public House already exists.


Now I can leverage this Service Connection in the app from the previous post -

I have added an Edit page to the app -

I will leverage the Service Connection from this page - but first I add a new field to my Organization Business Object -



















Here is the Edit page -

















I add a new Button - Create in Service Cloud




















I create a new Event -

















I select Call REST Endpoint -


















I select the Service Connection -



















Assign the Organization fields to the request body -






















Test -















Monitor the Integration -



























#669 - OIC --> VB CS Service Catalog

Here is a simple example of accessing the ERP service catalog via VB CS.









Open the context sensitive menu on the right and select Settings -







































Note the Catalog URL format for the REST APIs-

https://YourERP/fscmRestApi/otherResources/latest/interfaceCatalogs

Test -








Catalog is found, so let's leverage it in our VB CS App -

I begin by creating a new Service Connection -









Select from Catalog -














Click Sales and Service -











The objects are displayed - I need to select an object and then the relevant endpoints.















I select contacts -





















Note the 114 endpoints - a good reason for going thru Integration and its business-friendly adapters!
For this simple example, I just select GET contacts
































I select the GET /contacts api -

























I click Test -
























Great stuff!
Next step would be to simply leverage this in my app -
























#670 How to migrate from ICS to OIC --> A Team blog post

#671 OIC 18.4.5 New Features - UI Path RPA adapter

An overview of this new feature that comes with the December 2018 release -

UI Path RPA Adapter











Earlier this year I blogged on how to leverage UI Path Robots from OIC.
You can read the post here

Essentially, I had to make ca. 6 REST adapter invokes to get the robot to go the final mile for me.
It was doable, albeit somewhat pedestrian. Now this has all been encapsulated in the UI Path RPA adapter.

In this example - based on an excellent lab from my esteemed colleague Chris P.- I will integrate via the Robot with a mainframe system, courtesy of SIM390 -













In the demo - I will update the TICKETS file, listed above.

My RPA process is as follows - (Process being what the Robot executes)




















Essentially, I am logging on to the Mainframe and updating the TICKETS file.

This process has been deployed to the UI Path Orchestrator -














The Robot, running on my laptop, has access to it -




















Now to OIC -

Create the RPA Connection


I create a new UIPATH RPA connection -














Only Invoke is supported at adapter level. 
You can, of course, trigger an OIC flow from UI Path via other means.

Adapter Configuration -













I add my security credentials for the UI Path Orchestrator and then test -
























Looks good!

Create the Integration


















I use a REST trigger -


































Note the name of the request variable - ticketMessage.

I add the UI Path connection as an Invoke -




















I configure as follows -
















I select my RPA Process - Mainframe and configure as follows -



















I do the required mapping -












and the final response mapping -




Now to testing -


I use Postman -














#672 OIC 18.4.5 New Features - Process

Just covering some of the new features -
full list is here

Simple Forms Editor


18.4.5. brings us a simple form editor in Process -





































I click the +























































Streamlined Navigation for Processes and Decisions 













Dynamic process iFrame support


Check out the UI Process components at https://yourOIC/ic/pub/components




























Click on the Cookbook link -





and see what is available -
























#673 - OIC Integrations - re-submittable errors



















This is an area customers often ask me about.
Firstly, when is an error re-submittable?

According to the docs -

You can manually resubmit failed messages. Resubmitting a failed message starts the
integration from the beginning.
All faulted instances in asynchronous flows in Oracle Autonomous Integration Cloud
are recoverable and can be resubmitted. Synchronous flows cannot be resubmitted.

Next question is - how do I know if my flow is asynchronous?

Again back to the docs -

Oracle Autonomous Integration Cloud does not currently allow modeling an
asynchronous request response service. However, all scheduled orchestration
patterns internally use an asynchronous request response.

However, OIC does support Fire and Forget (one-way requests).

Now you ask - how can I create a Fire and Forget interface for my OIC Integration flow?

This is relatively simple -

I use a SOAP trigger, based on a one-way wsdl.

The one way wsdl, as the name strongly suggests, does not include a response.

  Note: the portType - CreateOrgService - definition only includes input.

I then create a new SOAP Connection in OIC, based on this wsdl.

That Connection, I then use as a Trigger in the following integration -




































Very minimalistic - suffice it to say, I just invoke Service Cloud to create a new Organization.

I have guaranteed an error by mapping the lookupName. This is not allowed, as this field is auto-generated by Service Cloud.



















I deploy and test, which brings me back to the first screenshot in this post -






























Resubmitting Errors

I have a new integration flow with my one-way SOAP Trigger.
This integration simply writes new organization data to a file,
via the OIC Connectivity Agent.

The destination folder is on a thumb drive, plugged into my laptop.











As you can see, I already have 1 output file there.

I now remove the thumb drive and test the integration again -















The flow throws an error -




























The above error message is self explanatory.

I now re-insert the thumb drive








I re-submit the error via OIC Monitoring console
























I check out the folder - voilà.














#674 OIC DB Adapter for Oracle Database 18c



I just installed the 18c DB on my laptop, now to leverage it from OIC -


Create DB Connection
























Here is the DB connection configuration -
























I specify my Connectivity Agent group -

















Leverage the DB Connection in an Integration

I created an integration with a REST Trigger -
Payload is custId and customerName.

I then invoke the DB connection -


I select the Customers table -














and then Import -

















My table does not have a primary key defined, so the following appears in the wizard -
















I do the mappings -














I test via Postman -





















I check the DB -










DB Select Example 

















Note the parameter definition -

select cust_name from customers where cust_id = #custId
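The #custId token is the DB adapter's bind-parameter syntax. The same parameterized query in plain Python looks like this - sqlite3 is used purely for illustration (its placeholder is ? rather than #), with a throwaway in-memory table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table customers (cust_id integer, cust_name text)")
conn.execute("insert into customers values (?, ?)", (1, "ACME"))

# Equivalent of: select cust_name from customers where cust_id = #custId
row = conn.execute(
    "select cust_name from customers where cust_id = ?", (1,)
).fetchone()
print(row[0])  # ACME
```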













Map the response -


















Test via Postman -


































































































#675 OIC Integrations import via REST API

A simple example of using the REST API via curl.

The first command checks whether the integration already exists.

I am using the demo Hello World integration, shipped with OIC.

curl -k -u yourOICuser:Pwd -i -X GET -H "Content-Type: application/json" https://yourOICInstance/ic/api/integration/v1/integrations/HELLO_WORLD%7C01.02.0000


To import a net new integration -

curl -X POST --insecure -u yourOICuser:Pwd -F file=@"AA_SERVICECLOUDNEWO_01.00.0000.iar" https://yourOICInstance/ic/api/integration/v1/integrations/archive

You can, of course, do the SSL setup and drop the --insecure directive.
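A deployment script typically combines the two calls: GET first, then import if new or replace if it already exists. A sketch of that decision logic - the status codes follow usual REST conventions, and the exact replace call should be checked against the OIC REST API docs:

```python
def import_action(get_status: int) -> str:
    """Pick the import call based on the existence check's HTTP status."""
    if get_status == 200:
        return "PUT /integrations/archive"   # integration exists - replace it
    if get_status == 404:
        return "POST /integrations/archive"  # net new - import it
    raise RuntimeError(f"unexpected status {get_status}")

print(import_action(404))
```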



#676 Integration@OOW London 16/17th January 2019

I am really looking forward to networking with partners and customers at next week's OOW in London.

For those attending, I can highly recommend the following session -



Maybe see some of you there!


















#677 Consume HCM Atom Feeds in Oracle Integration














Dedicated to those of you interested in integrating with Oracle HCM -


So what are Atom Feeds?

HCM Atom feeds generate notifications on HCM events. 
For example, a new employee is hired - this causes a corresponding Atom Feed to be sent to the Atom server. The feed contains metadata and a pointer to the REST resource which can be used for retrieving the employee. Oracle Integration can subscribe to these Atom feeds.

So which HCM events are supported?

According to the docs -

Oracle HCM Cloud currently supports creation of Atom feeds for the emps and workstructures resources in the following scenarios:

When an employee is hired, updated, or terminated
When an assignment is created, updated, or end dated

When any of the following workstructures is created, updated, or end dated:
Grades
Jobs
Locations
Organizations
Positions

So now to a simple example -

Here is my HCM connection definition in Oracle Integration, yes we are dropping the C.
















Here is the first part of the integration -





































This is a scheduled integration that gets new hire information from the HCM Atom Server.

The HCM adapter invoke is configured as follows -












































Note the checkbox at the bottom -

Select this checkbox to send an HTTP request
for each entry in the feed to the ATOM server
to fetch the business object snapshot.

Ok so now we have the atom feeds - next steps are

















for each new hire ...








get the employee details and write them to a file via ftp server invoke.

Let's look at getEmployeeDetails -

Here we just use the REST adapter to call the following -
The Atom Feed contains the emp unique id, so we can call the resource
Get an Employee - /emps/{empsUniqID}

As you can see, we do this within a For Each loop.
Now to the mapping from the atom feed to the REST query parameter -
I map the PersonId to the parameter.

BTW, the lab is courtesy of my HCM colleague Peter S.
Thanks Peter!

Now back to that checkbox I mentioned -







What happens when I check this? In other words, what does
the Atom feed response look like?
Note the Emps above EmployeeNewHireFeedWithBO_Context -
#678 Integration Cloud / HCM Integration & HCM adapter in depth

I have been looking at integrating with Oracle HCM recently.
As an HCM neophyte, I began by looking at what it actually contains -

Here are the pillars
Here are the modules within those pillars
Global HR is the core solution.

From the docs -
A unified HR system in the cloud lets you eliminate disparate systems and align your HR processes and reporting worldwide—ensuring HR process and data consistency. To ensure efficient local operations, features such as entire system translations, data protection support, local business rules, country payrolls, and compliance reporting are crucial. Highly configurable processes are also needed to simplify quick adaptation to legislative and organizational changes—without causing disruption to your business. 


Inbound Integration support within Oracle HCM


Oracle HCM comes with a plethora of pre-built integration capabilities e.g.
HCM Data Loader 

HCM Data Loader is a powerful tool for bulk-loading and maintaining data. The data can be from any source. You can use HCM Data Loader for data migration, ongoing maintenance of HCM data, and coexistence scenarios, where core HR data is uploaded regularly. 

You can also load data into HCM via Spreadsheet Loader -

HCM Spreadsheet Data Loader is a generic data loader that you can use to load most objects that HCM Data Loader supports. The exceptions are components, such as Document Record Attachment and Person Image, that don't provide user keys. Objects are loaded from the spreadsheet to the HCM Data Loader stage tables. HCM Spreadsheet Data Loader is available in the Data Exchange work area.

Finally, Payroll Batch Loader can be used to load payroll business objects into Oracle HCM Cloud.



Outbound Integration support within Oracle HCM

So how do we get data out of Oracle HCM?

Bulk Data Extract -

The main way to retrieve data in bulk from Oracle HCM Cloud is HCM Extracts, which is a tool for generating data files and reports.
HCM Extracts has a dedicated interface for specifying the records and attributes to be extracted. You:
  • Identify records for extraction using complex selection criteria.
  • Define data elements in an HCM extract using fast formula database items and rules.
You manage HCM Extracts either in the Data Exchange work area or using the Checklists interface in the Payroll work area. Alternatively, you can run extracts using the Flow Actions Service web service from outside Oracle HCM Cloud. This feature enables you to automate the outbound extract as part of an overall integration flow.
You can also leverage BI Publisher -

Oracle BI Publisher supports both scheduled and unplanned reporting, based on either predefined Oracle Transactional Business Intelligence analysis structures or your own data models. You can generate reports in various formats. To use Oracle BI Publisher for outbound integrations, you generate reports in a format suitable for automatic downstream processing, such as XML or CSV.



Oracle HCM Atom Feeds

Read all about them in this blog post.


The above are discussed in detail in the following Oracle doc








Oracle HCM REST API

The most recent (19a) REST API documentation is here

The use cases covered by the REST API are detailed here

The following screenshot just lists some of them -
Oracle HCM SOAP API


HCM support for SOAP Services is documented here


Now to Oracle Integration and how it supports us in integrating with Oracle HCM -

Oracle Integration HCM Adapter

The adapter supports the following -
1. REST APIs
2. Bulk Data Extract
3. HCM Data Loader
4. Atom Feeds

Let's look at these in a bit more depth -

REST APIs

The REST APIs are listed here

Let's look at employees -
I run Get all employees in Postman and find Chris -
Chris and I go back a long way, we used to drink together down in the Horse and Hound Public House in Harmonstown.

His PersonId is -








I can, of course, use that as follows -

/hcmRestApi/resources/latest/emps?q=PersonId=100000000276211
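Composing that query URL with proper encoding can be sketched as follows (the host name is a placeholder):

```python
from urllib.parse import urlencode

def emps_query_url(host, person_id):
    # keep '=' literal inside the q value, since HCM expects q=PersonId=<id>
    params = urlencode({"q": f"PersonId={person_id}"}, safe="=")
    return f"https://{host}/hcmRestApi/resources/latest/emps?{params}"

print(emps_query_url("hcm.example.com", "100000000276211"))
# → https://hcm.example.com/hcmRestApi/resources/latest/emps?q=PersonId=100000000276211
```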
So now, having looked at the ways in and out of HCM, 
let's look at how the Oracle HCM adapter supports us.



Oracle Integration HCM Adapter

The HCM Connection in Oracle Integration is configured with the following -









The HCM Services Catalog WSDL URL points to the service catalog service - from the docs -

The service catalog service is a Fusion Application service that returns a list of external services available for integration. It allows clients to retrieve information about all public Fusion Application service endpoints available for that instance.


The Interface Catalog URL enables the use of REST resources.
Now that we have the connection - let's look at it in use -
The wizard provides a business-friendly interface to the HCM integration components I have already discussed.



Query, Create, Update or Delete Information:
Let's check out SOAP based Business Objects -





The Target mapping is as follows -
Let's look at SOAP based Services -
The Target mapping is the same as above.

Now to the REST Services - I am switching to Emps
The Mapping Target in this case is -











Remember my Postman example?

/hcmRestApi/resources/latest/emps?q=PersonId=100000000276211

I essentially do the same here.


Extract Bulk Data using HCM Extracts:
The HCM Extract must, of course, already exist -

I have access to one called - Encrypted New Hire Export

One can navigate to the Extracts in the HCM app as follows -

Main Menu - Data Exchange --
One can also check out the fields extracted etc.
The resulting file of the extract is placed into the UCM (Universal Content Management) component of Oracle HCM.

This is where Oracle Integration picks it up from.
Ergo, the extract must have run at least once before we can pick anything up.


So back to the HCM adapter -
The Mapping Target is as follows -
The highlighted field has to be set. This is used to identify the last HCM extract processed.
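The idea behind that field can be sketched as a simple watermark check - the document ids and timestamps below are made up for illustration:

```python
from datetime import datetime, timezone

def newer_extracts(extracts, last_processed):
    """extracts: (document_id, created_at) pairs; return those newer than the watermark."""
    return [(doc, ts) for doc, ts in extracts if ts > last_processed]

# Hypothetical extract runs sitting in UCM, and the last one we already processed.
runs = [("UCMFA001", datetime(2019, 1, 10, tzinfo=timezone.utc)),
        ("UCMFA002", datetime(2019, 1, 15, tzinfo=timezone.utc))]
print(newer_extracts(runs, datetime(2019, 1, 12, tzinfo=timezone.utc)))
```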

Now to a simple integration that demos leveraging the HCM Extract in Oracle Integration.

I create a new scheduled integration -
I add the HCM adapter -
I now define a schedule parameter which will be used in the mapping.
I initialise as follows -







Now to the Mapping -








In this simple example, I will send an email to myself,
containing the Extract document id etc.
Here is the result -








The next area I will discuss is HCM adapter support for Atom feeds -

This section is short and sweet as I have already documented this in the blog post here
Finally - File Upload -

This is discussed in the following blog post.
#679 Oracle Integration for HCM - File upload

Here is an example of doing the above -
Now let's look at the actions here -

1. Schedule Trigger - no need to explain this.

2. ReadFile from FTP -
here I read a file from my FTP server. This is a zip containing a .dat file in the following format -
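As an aside, packaging such a .dat file into the zip placed on the FTP server can be sketched as follows - the file name and contents here are purely illustrative:

```python
import io
import zipfile

def package_dat(dat_name, dat_content):
    """Zip a single HCM Data Loader .dat file in memory."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(dat_name, dat_content)
    return buf.getvalue()

# Hypothetical file name and contents - the real .dat follows the HDL format.
zip_bytes = package_dat("Worker.dat", "sample .dat content")
```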









The ReadFile is configured as follows -
UploadFile to HCM is configured as follows -
As you can see, this is using the standard SOAP adapter to talk to UCM.

Now the mapping to UploadFile is interesting, to say the least.

Here is the source -
Here is the target -
Let's look at the 5 fields under Document -

The fields that the UCM SOAP service expects are described here -
Full doc - click here

I set the following -
dDocTitle
dDocType
dSecurityGroup
dDocAccount
primaryFile


The 5 Target fields are created by repeating field1 four times.
Each field has the following structure -







the values of which are set as follows -










and so on...

dDocType is set to "Document"

dSecurityGroup is set to "FAFusionImportExport" - again, as shown in the screenshot from the UCM doc above.

dDocAccount is set to "hcm$/dataloader$/import$"

primaryFile is set to the ReadFtp Response file name
e.g. FileReadResponse/ns2:ICSFile/ns2:Properties/ns2:filename
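For illustration, the five Field elements set in the mapping can be sketched as XML like this - note this is a simplified fragment, not the full GenericRequest envelope, and the dDocTitle value is my assumption:

```python
import xml.etree.ElementTree as ET

def ucm_document_fields(filename):
    """Build the Document/Field fragment the UCM generic service expects."""
    doc = ET.Element("Document")
    for name, value in [
        ("dDocTitle", filename),                      # title value assumed here
        ("dDocType", "Document"),
        ("dSecurityGroup", "FAFusionImportExport"),
        ("dDocAccount", "hcm$/dataloader$/import$"),
        ("primaryFile", filename),
    ]:
        field = ET.SubElement(doc, "Field", {"name": name})
        field.text = value
    return doc

print(ET.tostring(ucm_document_fields("Worker.zip"), encoding="unicode"))
```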


What happens now that we have the file in UCM?
We need to use the HCM adapter to import and load the data from UCM to HCM proper.
Now to the mapping -

I set the Target ContentId to

$UploadHDLFileToUCM/
nsmpr1:GenericResponse/
nsmpr1:Service/nsmpr1:Document/nsmpr1:Field[@name='dDocName']


 and the Target Parameters to
"FileAction=Import_And_Load"











That's it!

Here is my input .zip, containing my .dat file -













I submit a run -
I check in HCM -