Channel: iPaaS@ORACLE.CLOUD

#721 OIC - Salesforce Adapter - Events triggering Integrations

Today a simple demo of using the SFDC adapter as a trigger in an OIC integration.
In other words, automatically calling an OIC integration when a business event occurs.

Here an OIC integration will be triggered on the creation of new Irish leads in SFDC.





Below is what one sees on dropping the SFDC adapter as a Trigger in OIC -
Essentially instructions on completing the pre-requisites in SFDC to make this work.













SFDC calls OIC when an event occurs, e.g. New Irish Lead.
For this you have to configure a couple of things in SFDC -

an Outbound Message - to call OIC - this will include specifying the OIC url and the object/fields you want to send e.g. Lead name, email etc.

a Workflow Rule - defining when to make the call e.g. on lead creation when the lead has an Irish address.
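The Workflow Rule criteria can be sketched as a simple predicate; this is an illustrative sketch only, not SFDC's actual rule engine, and the field names are assumptions.

```python
# Illustrative sketch of the Workflow Rule criteria - fire the outbound
# message only for leads whose country is Ireland. Field names here are
# assumptions, not the real SFDC schema.
def should_notify(lead: dict) -> bool:
    return lead.get("Country") == "Ireland"

leads = [
    {"LastName": "Dermot", "Country": "Ireland"},
    {"LastName": "Jimmy", "Country": "Germany"},
]
irish_leads = [l for l in leads if should_notify(l)]
```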


Create Outbound Message and Workflow Rule














I select Lead -






Note: I enter a dummy Endpoint URL.
This I will replace later with the concrete OIC url, once the integration has been activated.

I also specify the Lead fields to send -













































I save the outbound WSDL to a file for later use in the OIC Trigger configuration -

















I now create a Workflow Rule in SFDC -




































This rule will fire for new Leads with a country address = Ireland
(set in Rule Criteria)







I now bind this to the outbound message I previously created -



















Last step is to activate the rule -

























Now back to the Trigger configuration in OIC -

Create Integration in OIC

As you can see, I have uploaded the outbound WSDL file.













The above is just telling me to update SFDC Outbound Message definition with the correct URL,
once the integration has been activated.

In this case, I will just write the new leads to a file on an ftp server.

I start by setting the Tracking field -



















I implement as follows -




















Mapping Source, i.e. the data sent from SFDC, is as follows -

























I activate and update the SFDC workflow definition with the concrete integration URL.

I now test by adding a new Lead in SFDC -























I monitor in OIC -













I check my ftp server -









and there he is, the bauld Dermot.


#722 OIC - Salesforce Adapter - Perform Core or Utility Operations (ConvertLead)


















SFDC Core and Utility Services are documented here


















































The OIC SFDC adapter supports a subset of these -
















Convert Lead - Converts a Lead into an Account, Contact, or (optionally) an Opportunity.

Get Deleted - Retrieves the IDs of individual records of the specified object that have been deleted since the specified time

Get Server Timestamp - Retrieves the current system timestamp (Coordinated Universal Time (UTC) time zone) from the API

Get Updated - Retrieves the list of individual records that have been updated (added or changed) within the given timespan for the specified object.
Use this for data replication applications to retrieve a set of IDs for objects of the specified object that have been created or updated within the specified timespan.

Get User Info - Retrieves personal information for the user associated with the current session, e.g. full name, email etc.

Merge - Combines up to 3 records of the same type into 1 record. The input is an array of MergeRequest elements, each of which specifies the records to combine e.g. merging contacts or leads.

Process - Submits an array of approval process instances for approval, or processes an array of approval process instances to be approved, rejected, or removed.

Send Email - Immediately sends an email message - counts against the daily email limit.

Undelete - Undeletes records from the Recycle Bin.
Details on the Recycle Bin here

Now to an example - 

Convert one of our Leads to an Account and Contact












Here is the OIC invoke configuration -




















Here is the mapping -

























The request includes the leadId of my contact, Jimmy.
convertedStatus has been hard-coded as follows -







The response mapping -















I have mapped accountId, contactId, success from Source to the Target fields.


I test via Postman -












I validate in SFDC -

Jimmy is now a Contact -















The Hare of the Dog Pub is now an Account -


























Note also that an Opportunity has been created for this account.



#723 OIC - Salesforce Adapter - Invoking APEX web services

APEX - a strongly typed, object-oriented programming language that allows developers to execute flow and transaction control statements on Salesforce servers in conjunction with calls to the API. Using syntax that looks like Java and acts like database stored procedures, Apex enables developers to add business logic to most system events, including button clicks, related record updates, and Visualforce pages. Apex code can be initiated by Web service requests and from triggers on objects.

Full SFDC docs here

So Apex is value-add logic I can create in my SFDC environment - here is a simple example -






















I have created the APEX class - NiallCEmailManager and have tested it.





















I now need to amend the code to expose it as a web service -
















I now check for the APEX class in SFDC - Setup

















I can now generate the WSDL -




















and  save locally -
















Create the Integration in OIC


The integration has a REST trigger.
Request contains a valid email address.

SFDC Connection configured as follows -














Mapping -














I have hard-coded subjects and messages.








addresses is set to Source email

I activate and test via Postman













#724 OIC - Salesforce Adapter - Bulk API

So what is this for?
It allows one to perform Bulk Data Operations:
Inserts, updates, upserts, or deletes of a large volume of records.

For those interested, the SFDC Bulk API Doc is here

















An extract from the aforementioned -

Bulk API is based on REST principles and is optimized for loading or deleting large sets of data. You can use it to query, queryAll, insert, update, upsert, or delete many records asynchronously by submitting batches. Salesforce processes batches in the background.

SOAP API, in contrast, is optimized for real-time client applications that update a few records at a time. You can use SOAP API for processing many records, but when the data sets contain hundreds of thousands of records, SOAP API is less practical. Bulk API is designed to make it simple to process data from a few thousand to millions of records.

The easiest way to use Bulk API is to enable it for processing records in Data Loader using CSV files. Using Data Loader avoids the need to write your own client application.

Data Loader is an app you can download from your Salesforce instance -





















Leveraging the Bulk API via OIC -


First, an extract from the Oracle SFDC  adapter docs -

The Salesforce Bulk API enables you to handle huge data sets asynchronously with
different bulk operations. For every bulk operation, the Salesforce application creates
a job that is processed in batches.
A job contains one or more batches in which each batch is processed independently.
The batch is a nonempty CSV/XML/JSON file that is limited to 10,000 records and is
less than 8 MB in size. Because the batches are processed in parallel, no execution
order is followed. A batch can contain a maximum of 10,000,000 characters in which
5,000 fields in a batch are allowed with a maximum of 400,000 characters for all its
fields and 32,000 characters for each field.
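The batch limits quoted above translate into a simple client-side batching step. Here is a minimal sketch covering the 10,000-record ceiling only (the character and field limits are deliberately omitted); this is illustrative, not the adapter's actual implementation.

```python
# Split a record set into Bulk-API-sized batches.
# The 10,000-record ceiling comes from the adapter docs quoted above;
# the character/field limits are not enforced in this sketch.
MAX_RECORDS_PER_BATCH = 10_000

def make_batches(records, batch_size=MAX_RECORDS_PER_BATCH):
    """Yield successive slices of at most batch_size records."""
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]

batches = list(make_batches(list(range(25_000))))
# 25,000 records -> three batches: 10,000 + 10,000 + 5,000
```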

Here is a simple example where I will read a file of Leads from an ftp server and leverage the Bulk api to insert them into SFDC.




BulkLoadLeads is configured as follows -





I upload a test csv file to my ftp server -










As this is a scheduled orchestration, I click Submit Now -











As you can see, this is the second run -












I go to Monitor Bulk Data Load Jobs in Salesforce -

























#725 - OIC 19.3.1 July 2019 New Features

#726 OIC - using the Google Calendar adapter

Simple example of the above -













Pre-Requisites



First step is to do the pre-reqs defined in the Google adapter doc here

Navigate to https://console.developers.google.com/apis

Click on Enable APIs and Services












Select Google Calendar -













Once the API is enabled - then navigate to Credentials - creating a web app.












Credentials include -

Client ID
Client secret

You also need to set the re-direct URI to

https://yourOIC:443/icsapis/agent/oauth/callback























Create the OIC Integration 

Create the connection, leveraging client ID etc.




















Note the Scope: this can be set to either -
https://www.googleapis.com/auth/calendar or
https://www.googleapis.com/auth/calendar.readonly


This simple integration creates an event in my Google Calendar -

























Here is the configuration of the adapter -





















I map the following request fields -




















The response mapping is as follows -

Note: I have filtered on the mapped fields -















I activate and test -












The response is as follows -











I consult my google calendar for September 20th -






















Google Calendar API docs are here


#727 OIC - Integrations leveraging Process for Error Handling Part 1

Here is a simple example of leveraging Process for human intervention in error handling.

I have an integration that uses the connectivity agent to write to an on-premise file.
The use case is simple - json request contains customer details and these are simply written to a file.






































CreateCustomer is the action that invokes the File Adapter.

The Global Fault Handler is configured as follows -



















As you can see, the fault handler is calling a process.
This process will display the Error and the Customer payload.
The use case here - the person to whom this task is assigned views the error, and, if possible, takes corrective action.

In my simple example the connectivity agent is down. She restarts the agent and then re-submits the customer data to the integration.

Now to a test -


As you can see, the request to OIC has timed out.

Here is the Instance Tracking in OIC -




















































I login to Process Workspace and see the following task -







































The error message is salient and to the point -

No response received within response time out window of 260 seconds. Agent may not be running, or temporarily facing connectivity issues to Oracle Integration Cloud Service.

I re-start the agent -













I then click Re-Submit -

























Customer data is processed and the file is written -










Ok, this may not be the most appropriate use case for the process user.
So how about one that is?

I have a DB table - Customers -







Working with the customer data from the previous example - we will insert a new record into the customer DB, mapping custnr to cust_id.

The integration has been amended as follows -

I added a scope for the DB Insert - I also added a call to our error handling process in the Scope Fault Handler -






















I test with the following payload -












This returns an HTTP 200 - OK.

I check in OIC Monitoring -
















However, when I view this -

























This is caught by the Scope Fault Handler -




















Ergo, the process has been called.

I check Workspace -

























Apologies for the error message in German, but such is life!
The name value is too large for the column.

Now I can fix this error and re-submit.


























Now the issue is 2 files have been written -










So let's change the integration somewhat and leverage only a Global Fault Handler - no more Scope Fault Handler.















I test with the following payload -












This time I get an HTTP 500 response in Postman -


























The Process has been called - I fix the name length issue and re-submit -

























Integration completes successfully -


















File is written -


















DB is updated -












#728 OIC CI/CD with Flexagon

Comprehensive Oracle Partner offering for CI / CD for Oracle Integration Cloud.
Check it out here


















#729 OIC AQ adapter


















Queue Setup in Oracle Advanced Queuing


First step was to set up the queue in AQ.

CREATE TYPE Message_typ AS OBJECT (
  subject VARCHAR2(30),
  text    VARCHAR2(80));

EXECUTE DBMS_AQADM.CREATE_QUEUE_TABLE (queue_table => 'objmsgs80_qtab', queue_payload_type => 'Message_typ');

EXECUTE DBMS_AQADM.CREATE_QUEUE (queue_name => 'msg_queue', queue_table => 'objmsgs80_qtab');
EXECUTE DBMS_AQADM.START_QUEUE (queue_name => 'msg_queue');

I then created a procedure to enqueue a message -

CREATE OR REPLACE PROCEDURE P_AQ_ENQ AS
  enqueue_options    dbms_aq.enqueue_options_t;
  message_properties dbms_aq.message_properties_t;
  message_handle     RAW(16);
  message            Message_typ;
BEGIN
  message := Message_typ('NC MESSAGE', 'Gruess Gott von AQ');
  dbms_aq.enqueue(queue_name         => 'msg_queue',
                  enqueue_options    => enqueue_options,
                  message_properties => message_properties,
                  payload            => message,
                  msgid              => message_handle);
  COMMIT;
END;

Create the Integration in OIC

Simple use case - dequeue a message and write it to a file.


























AQ getMsg configured as follows -


























I set Tracking -















I now execute the PL/SQL procedure to enqueue a message -













I check my ftp directory -













I check out the Monitoring/Tracking screen -













Simple and succinct.




#730 OIC - Google Tasks adapter

Here is a simple example of leveraging the adapter -
















Naturally, I begin by creating the connection.














There are some pre-requisites for leveraging the Google Tasks API from OIC.
You need to create the OAuth credentials.
This you do via console.developers.google.com

















I click Enable -











Now I click Create Credentials -

and generate the OAuth Client ID - (Client ID/ Client Secret)













I will also need to specify the scope - e.g. read only or read/write.























I add my client id and secret and then click Provide Consent -





















I click Allow -











I now switch back to the OIC Connection definition and click Test and then Save.

Now to using it in an integration -













I have a REST interface with the following Request/Response -

{"taskList":"myTaskList", "taskName":"task", "description": "desc",
"dateDue": "2019-10-12T23:28:56.782Z"}

{"taskListId":"ABC", "taskId": "123"}

Essentially, I will be creating a new tasklist and adding a task to it.
The response contains the IDs of the tasklist and task.

I drop the google Task connection into the integration -




















and select Insert Task List - this will add a new Tasklist to my existing list -

















I then drop the google task adapter again and select Insert Task -
I will add the tasklist ID to the Insert Task Request, to create the link between the two.


Maybe this is a good time to look at the task api docs here

















So back to the integration -

























Here is the mapping for createTaskList -













I set kind = "tasks#taskList"

title is set to the request field - taskList

Here is the mapping for createTask -























I set kind = "tasks#task"

Note how I set the tasklistId under TemplateParameters.
This appears under TemplateParameters because the url is as follows -




This is set to the ID returned by the createTaskList request.
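The two mappings above can be summarised as plain dictionaries; this is an illustrative sketch only - the wrapper shape and the template-parameter handling are simplifications, though the kind/title/notes/due field names follow the Google Tasks API.

```python
# Sketch of the createTaskList and createTask mappings as dictionaries.
# The outer "tasklist"/"body" wrapper is an assumption for illustration.
def build_task_list_request(task_list_name: str) -> dict:
    return {"kind": "tasks#taskList", "title": task_list_name}

def build_task_request(task_list_id: str, task_name: str,
                       description: str, due: str) -> dict:
    # The tasklist id travels as a template (path) parameter because the
    # API url is POST /tasks/v1/lists/{tasklist}/tasks.
    return {
        "tasklist": task_list_id,          # template parameter
        "body": {
            "kind": "tasks#task",
            "title": task_name,
            "notes": description,
            "due": due,
        },
    }

req = build_task_request("ABC", "Important Task", "sehr wichtig!",
                         "2019-10-12T23:28:56.782Z")
```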





















I activate and test with the following payload -

{"taskList":"myTaskList", "taskName":"Important Task", "description": "sehr wichtig!", "dateDue": "2019-10-12T23:28:56.782Z"}

I check in Google Tasks -



#731 SOA Suite 12.2.1.4.0 Documentation available

#732 Fusion ERP Batch Extracts with OIC

Simple example and background explanation here.
Caveat to begin with - I am not a Fusion ERP expert, but I do like understanding how things work.

So what is a batch extract?
Essentially a scheduled process (ESS - Enterprise Scheduler) that runs a report in ERP, e.g.
Extract payment data since the last execution to update downstream or upstream applications to reflect payments.

So two components here - the scheduled process and the report to be run, according to that schedule.

Where can I monitor the execution of these scheduled processes in Fusion ERP?










Here is a job I ran earlier, triggered by an OIC integration
(we will look at the integration in detail later) -















Payables Transactions Extract or sliocht idirbhearta iníoctha, as we are wont to say in Ireland.

However, there are many more of them -




















A scheduled process might have parameters that you can set to control which records are included or how they're affected. For example, a process updates only the records that are effective within the date range that you define.

So where are these ESS processes or jobs defined?










Search for Manage Enterprise Scheduler Job Definitions -



















As you can see from the above, I then searched for Payables Transactions Extract -

The job is of type - BI Publisher. That means there is a BI Report defined already.





























Here are the BI Report details -























So, net net: you begin with a BI Report, either one of those delivered with ERP or a custom report of your own. You then call this from an ESS job.

I have a simple example which covers the whole lifecycle -

The interaction is as follows -

1. OIC Trigger integration: Trigger the job to execute the report.
The job payload will also specify a callback url - which enables a 2nd integration to be called once the job has run successfully and the result is in UCM.

2. Fusion ERP: The report executes

3. Fusion ERP: Result is written to UCM

4. Fusion ERP: Callback Integration is triggered

5. OIC: Result is picked up from UCM and sent to destination.

OIC Integration triggering BI Report run








initiateExtract: Here we will leverage the Fusion ERP exportBulkData operation. This submits an ESS job to start the BIP report processor and eventually uploads the report output to UCM. The supported output formats are XML and CSV. Callback and notification are also supported.

The Request Payload - 













The parameter list refers to the PaymentsTransactionsExtract: Parameters screenshot above.

Here is a sample input for the parameter list -

92,/oracle/apps/ess/financials/commonModules/shared/common/outbound;PayablesTransactionsExtract,BIPREPORT,FULL_EXTRACT,#NULL,300000046987012,#NULL,#NULL,#NULL,#NULL,#NULL,12-18,N,N,300000046975971,#NULL,#NULL,#NULL,FULL_EXTRACT,#NULL,#NULL,#NULL,PayablesTransactionsExtract,#NULL

jobOptions will be set to ExtractFileType=ALL

notificationCode is set to 30














callbackURL is set to
nn/ic/ws/integration/v1/flows/soap/DEMO_ERP_BULK_EXTRAC_CBK/1.0/

This points to my callback integration - more about that later.


The Response will be the requestId of the submitted ESS job.
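Pulled together, the exportBulkData request from step 1 can be sketched as a dictionary; the dict shape is an assumption for illustration, not the adapter's real message format, and the parameter list is truncated here for readability.

```python
# Hypothetical sketch of the exportBulkData request payload.
# Keys mirror the fields discussed above; values are the ones used in
# this post (parameterList truncated for readability).
def build_export_request(parameter_list: str, callback_url: str) -> dict:
    return {
        "parameterList": parameter_list,
        "jobOptions": "ExtractFileType=ALL",
        "notificationCode": "30",
        "callbackURL": callback_url,
    }

req = build_export_request(
    "92,...,PayablesTransactionsExtract,#NULL",  # truncated
    "nn/ic/ws/integration/v1/flows/soap/DEMO_ERP_BULK_EXTRAC_CBK/1.0/",
)
```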

I execute the integration and view progress in ERP - Scheduled Processes -










I check the response received from the initiateExtract call -

























Note the requestId returned - 1615660.

OIC Callback Integration - picking up result from UCM


This integration picks up the extract result from UCM.
Remember, we have passed the url to this integration as a parameter in the invoke of exportBulkData 






Let's look at the individual steps -

bulkExtractCallback:
The trigger is based on a SOAP connection, which itself is based on the following wsdl -





























As you can see, the request contains the following -

 "requestId", "state", "resultMessage"

























Note: requestId = 1615660.

downloadFile: leverages a SOAP connection, based on the Fusion ERP wsdl


















The operation selected is getDocumentsForFilePrefix.

Details can be found here -


























prefix is set to “ESS_” + //requestId
makes sense!

account is set to fin$/payables$/export$

comments is set to "processedby=" , //requestId  
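The three getDocumentsForFilePrefix inputs above can be assembled from the callback's requestId; a sketch under the assumption of a flat dictionary shape (the real wsdl types differ).

```python
# Sketch of the getDocumentsForFilePrefix inputs, built from the
# requestId received in the callback. The dict shape is illustrative.
def build_download_request(request_id: str) -> dict:
    return {
        "prefix": "ESS_" + request_id,
        "account": "fin$/payables$/export$",
        "comments": "processedby=" + request_id,
    }

req = build_download_request("1615660")
```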

The Response is as follows (this is from a different run) -












The file is written to my ftp server -





#733 Fusion ERP Batch Extracts with OIC part II

Just some more background info -

I used the following connections to Fusion ERP in the previous post.









Both of these are used in the Callback Integration -
The first, for onJobCompletion, i.e. the integration trigger.
The second, to pick up the report from UCM.

The first connection is based on the following wsdl -

















The second connection is based on the following Fusion ERP wsdl -

https://yourERPhost/fscmService/ErpIntegrationService?wsdl

The ErpIntegrationService provides external operations for ERP integration scenarios to execute end-to-end inbound and outbound data flows. It also tracks the status of inbound and outbound data processes.

























Documentation of the above here




#734 Fusion ERP Batch Imports with OIC

Simple scenario here -

I have an FBDI-compliant AP Invoices file in a folder on my ftp server.
OIC leverages the ftp adapter, reads the file and invokes the import to ERP.
OIC receives the callback from ERP on job completion.


FBDI - File Based Data Import.
Fusion ERP provides FBDI templates for AP Invoices, Suppliers, GL etc.
Fusion users can upload such files via -


































Here is an example of such a file -










I generate a zip from this as follows -




























Note: I also needed to add the above .properties file to the zip.

Check out Jack Desai's excellent blog post for more info on this.
As Jack says -
The property file submitted with importBulkData operation includes the job definition and package names as well as the job parameters of the object being imported. You must generate and add the Job Properties File as part of the data ZIP file.
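The "csv plus job .properties file in one zip" step can be sketched in a few lines; a minimal sketch, assuming in-memory assembly - the file names used here are hypothetical, not the FBDI template's real names.

```python
# Illustrative: assemble the FBDI zip - data csv plus the job .properties
# file - in memory. Both file names below are hypothetical examples.
import io
import zipfile

def build_fbdi_zip(csv_name: str, csv_bytes: bytes,
                   properties_bytes: bytes) -> bytes:
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(csv_name, csv_bytes)
        # job definition/package names and job parameters travel here
        zf.writestr("ApInvoicesImport.properties", properties_bytes)
    return buf.getvalue()

payload = build_fbdi_zip("ApInvoicesInterface.csv",
                         b"invoice rows...",
                         b"oracle/apps/ess/financials/...")
```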


So how do we implement the import in OIC?







Simple - leverage the ftp adapter to read the AP file and then the Fusion ERP adapter -















































Now to a test...














Note the process id returned by Fusion ERP - 1615742.

I validate in ERP -
























Now to the callback integration...













Again, I am leveraging the ERP adapter - this time as trigger.




































Here is what I see in Fusion ERP after running an Import -








The trigger payload is as follows -




































Note the Transfer, Load and Import steps above. Also the Upload to UCM.



















#735 OIC October release new features


#736 OIC SOA Suite Adapter available as feature flag

The feature flag is called -
oic.cloudadapter.adapters.soaadapter

This can be activated on your OIC environment via an SR.
















Example to follow - just downloading the latest and greatest SOA Suite version 12.2.1.4 from here.

That was quick - SOA Suite Quick Start downloaded and configured/started within 30 minutes.














#737 OIC - CPQ Integration






Here is a short introduction to the OIC CPQ adapter.

As a pre-requisite, I suggest you read the Oracle CPQ Cloud - Service descriptions
It is available here

Now back to the OIC CPQ adapter -

A short introduction from the adapter docs - available here

-----------------------------------------
The Oracle Configure, Price, and Quote (CPQ) Cloud Adapter enables you to create
an integration with an Oracle CPQ application.

The Oracle CPQ Cloud Adapter enables you to convert sales opportunities into revenue
by automating the quoting and sales order process with guided selling, dynamic
pricing, and a workflow approval process.

Oracle CPQ cloud extends sales automation to include the creation of an optimal
quote, which enables sales personnel to configure and price complex products; select
the best options, promotions, and deal terms; and include upsell and renewals, all using
automated workflows.

The main use case for Oracle CPQ Cloud is as the trigger (source) in an integration in
which Oracle Sales Cloud is the invoke (target). This adapter replicates the point-to-point
integration that exists today between Oracle CPQ Cloud and Oracle Sales Cloud.

Oracle CPQ Cloud is the trigger (source) of the record application. A synchronize
process is triggered as you update and save data in Oracle CPQ Cloud, but it can also
be configured by the administrator of the application. The Oracle CPQ Cloud Adapter
can also be configured as the invoke (target) in an integration.

Prebuilt integration flows with Oracle CPQ Cloud and Oracle Sales Cloud for quote
creation, opportunity import, and quote update are also provided from the Oracle Marketplace.
--------------------------------------------------------------------------------------------------------------------

Pre-requisites

One must obtain the CPQ SOAP wsdl from CPQ itself - so login and click on Admin.































Select version 2.0














Ensure you are in the Commerce tab -










Just append ?wsdl to the url shown.


















Use the wsdl and your CPQ credentials in defining the CPQ Connection in OIC -





























OIC CPQ adapter also supports REST -

You simply add /rest/version/metadata-catalog to your base CPQ URL.

The doc mentions v3 and above; I found my CPQ is on v8. To check this, I clicked on Admin and then


I then checked out the version number -













Using CPQ SOAP Connection as an Integration Trigger















The input is a Transaction object from CPQ.

Again, from the docs -

This object is from the CPQ commerce process. This is the business object that
you receive from the Oracle CPQ application as a request document to start this integration flow. 
This business object is automatically selected based on the content of the WSDL
file you specified when creating the Oracle CPQ connection.

Response is pre-set.


























The Transaction is as follows -


























Note: you cannot use the CPQ REST Connection as a trigger.


Using CPQ SOAP Connection as an Integration Invoke






















Use the CPQ REST Connection as an Invoke






















CPQ REST api doc here




#738 OIC - CPQ Integration part II


Note: Everything in italics is from the CPQ online help or other CPQ docs.

Before looking more deeply at how one integrates with CPQ, let's look at the product itself -

CPQ in 5 minutes...









CPQ has its own scripting tool - BML -

BML (BigMachines Extensible Language) is a scripting tool that is used to capture a company's complex business logic within CPQ Cloud Configuration and Commerce.


























More in-depth info in the CPQ eBook here

In a nutshell -

Oracle CPQ Cloud enables both enterprise and midsize companies to streamline the entire opportunity-to-quote-to-order process, including product selection, configuration, pricing, quoting, ordering, and approval workflows.


1. Configure -Create valid product configurations with user-friendly, dynamic interfaces – whether your customer needs simple bundles or more complex “engineered-to-order” scenarios.

2. Price - Perform automated pricing calculations, apply discounts, streamline approval processes and validate quote information – all within minutes!

3. Quote - Empower your sales team members to create professional proposal documents, including current product and pricing information. Oracle CPQ Cloud Document Designer dynamically fills current data into proposal documents with just one-click. 

Integrate CPQ Cloud to your CRM, ERP or other web services to streamline, automate and govern the entire end-to-end sales process.

Oracle CPQ Cloud is a robust solution on its own, but when seamlessly integrated into the entire end-to-end sales-to-order process, CPQ Cloud bridges the gaps between your CRM and ERP systems. Optimize the sales process and cut down cycle times while increasing overall margins. When it is time for the order to be handled to the back office, essential information flows seamlessly to your ERP application in the manner that best suits your organization and business needs. 
















Note the seamless Interaction with Back Office Solutions -

For CRM, this includes Oracle Engagement Cloud, SFDC and MSFT CRM Dynamics.
For ERP, this includes Oracle ERP, Oracle EBS, JDE, SAP etc.

The CPQ documentation includes sections on the following -



























CPQ Integration


Let's check out the OIC section -









CPQ Cloud Integration Center enables me to define connections to OIC.

So let's check out the Integration Center -

The Integration Center is a page for managing integrations with CPQ Cloud. There are a number of integrations managed through the Integration Center platform, including OIC












































So now we have the connection to OIC.

Next to CPQ Integrations -

To use the integration you just created in the Integration Center, you must create an Integration on an existing Process.

So what are CPQ Processes?

Commerce Processes are used to create templates for the selling processes used by your company.

With Commerce Processes, you set up your quoting, ordering, approval, and other workflow processes. By creating ordered sets of Commerce document templates, along with associated attributes and actions, you enable buyers and supplier agents to conduct transactions on your customized CPQ Cloud application.

One can edit processes and add integrations












































Note the Endpoint URL is the Oracle Engagement Cloud SOAP Service -
/crmService/SalesPartyService.

However, integrations can also leverage the OIC connection -















CPQ Transactions

The previous post showed us leveraging the CPQ adapter.
The business object passed in the trigger example was a Transaction -

So what is a transaction in CPQ-speak?

A Transaction Manager is a list of Transactions or quotes that you can access. This is most frequently where approvers will go to approve quotes and where you can download your quotes or Transactions.

Ok - Transaction = Quote.

#739 Creating OIC instance

Got a call from someone asking how to do this on a free trial.

Step 1. cloud.oracle.com - enter your tenancy details - then your user/pwd
Step 2.

























Step 3.













Step 4.















Step 5.












That's it - now watch the magic happen...














































#740 Oracle Integration now available on Oracle Cloud Infrastructure Gen 2









Since last week, new Oracle Cloud accounts can create OIC instances on OCI Gen2.

This is available in the following data center regions -














Other Data Centers to follow.


The step by step creation instructions are available here

So what does Gen2 give me?
I will go into more details in a future post - but, to begin with -

1. OCI Gen2 Compartments - create your OIC instance in your own compartment for administrative lifecycle management isolation.

2. OOTB integration with OCI services such as Events, Functions, Streaming and Notification.

3. OIC integration in OCI One Console




