Monday, 8 July 2013

Fusion Apps - File Based Loader Part - III

In the first two blog posts (PART I & PART II) I explained how to create the HCM Business Objects Mapping file and use it to create a data file for loading into Fusion. I also explained how to create the zip file and register it using the composite. As I mentioned in the last post, when you submit the composite to register the file, the composite moves the file to a path and then calls the hr2hrSqlldr.py Python script to load the data into the staging tables.

Let's see the series of actions it performs. If you open the Python script you will find that it first checks, in a table named HRC_XFER_JOB_PARAMS, the input details provided by the user when the composite was called. From this table it gets the list of LBOs that were loaded in the zip file. A user may load all or only a few of the LBOs supported by FBL. To maintain data integrity, the LBOs have to be loaded in a pre-defined sequence, so the script gets the processing sequence from the table HRC_LOADER_METADATA_REGISTRY. The next screenshot shows the loading sequence from this table.
LBOs are processed starting with the one having the largest PROCESSING_ORDER value. Oracle generates a separate JOB ID for each of the top-level LBOs specified in the composite parameters. The script then creates SQL*Loader control files for loading data into the HRC_LOADER_BATCH_HEADER and HRC_LOADER_JOB tables and sets LINE_STATUS = 'Ready To Process'. Once this is done it gets all the data files for each of the LBOs, dynamically creates SQL*Loader control files for them, and inserts the data into the HRC_LOADER_BATCH_LINES_SRC table. Once all data files have been imported into that table, it calls the PL/SQL procedure HRC_SQLLOADER.PROCESS_BATCH_TRANSFORM to do the data transformation for each of the top-level LBOs.
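The sequence above can be sketched in Python. This is an illustrative reconstruction, not the actual hr2hrSqlldr.py: the table and column names come from this post, while the function names and the in-memory registry are my own assumptions.

```python
# Hypothetical sketch of two steps performed by hr2hrSqlldr.py:
# ordering the LBOs and generating a SQL*Loader control file.

def order_lbos(lbos, registry):
    """Sort LBOs by processing_order (from HRC_LOADER_METADATA_REGISTRY),
    largest value first, as described above."""
    return sorted(lbos, key=lambda lbo: registry[lbo], reverse=True)

def build_control_file(dat_file, table="HRC_LOADER_BATCH_LINES_SRC"):
    """Build a minimal '|'-delimited SQL*Loader control file for one LBO
    data file; the real script generates these dynamically per LBO."""
    return (
        "LOAD DATA\n"
        f"INFILE '{dat_file}'\n"
        f"APPEND INTO TABLE {table}\n"
        "FIELDS TERMINATED BY '|'\n"
        "TRAILING NULLCOLS\n"
    )

# Made-up processing_order values for three LBOs:
registry = {"Location": 90, "Job": 80, "Person": 50}
load_order = order_lbos(["Person", "Location", "Job"], registry)
```
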

For some reason it did not insert data into those tables in my instance. I don't know why; maybe I was doing something wrong. So I logged an SR with Oracle, and the initial response was that our version (11.1.4.0.0) is not the latest one where FBL works as expected. Let's do this instead by running Load HCM Data.
 Click on Load HCM Data under Coexistence.























Click on Schedule and select the file you loaded. I have already loaded a file, Location28.zip, so let me show you the steps with this file.






Select the file, LBO and run mode. Here I want to import into the staging tables and then load them to the base tables in a single run, so I selected Import and Load Batch Data. Alternatively, you can do the Import first and then load the data through the Load Batch Data process. Click on Submit.


It submits an ESS job which then submits child processes.









Click Refresh to see whether the process is completed.











 Click Processes tab to see whether it has any error.











 Click on the Batch Name Link as shown.











As you can see, it has one record and that record failed. Click on the 1 under Failed Objects.






 
Once you do that, it opens a screen which looks like this. Expand the Location LBO and select the data to see the error.








Here the error is 'Location code already exists'. I intentionally used this location to show the various features: I had already created a Location from the Fusion Application Navigator --> Workforce Structures --> Manage Location.



 This picture shows the error details.











I used that location code to create the data file because I wanted to explore whether it is possible to update data that was not created by FBL. As you can see, with this process you cannot update data that was not created by FBL. From Release 5 onwards Oracle has introduced the Business Object Key Map HCM Extract to generate GUIDs for business objects created manually in the application. I will explain that process in a separate post after completing this series.

Many of you may be wondering why it tried to insert instead of update.
Oracle Fusion HCM maintains a Key Mapping table HRC_LOADER_BATCH_KEY_MAP that records, for each business object, its type, source GUID, and Oracle Fusion ID (the ID by which it is identified in Oracle Fusion). For example:
OBJECT TYPE      SOURCE GUID      ORACLE FUSION ID
---------------  ---------------  ----------------
Person           PERS123          006854
Person           PERS456          059832

When a record is imported to the Load Batch Data stage tables, the import process compares the record’s source GUID and object-type values with values in the Key Mapping table:

  • If the values exist in the Key Mapping table, then the process replaces the source GUID in the stage tables with the Oracle Fusion ID.
  • If the values do not exist in the Key Mapping table, then an Oracle Fusion ID is generated for the record, recorded in the Key Mapping table, and used to replace the source GUID in the stage tables.
So in my case I did not know the GUID created by the application, and there was no entry in the key mapping table. So it generated a new Fusion ID and marked the record as Create, not Update.
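The decision logic can be sketched as follows. This is a toy model of the comparison described above: HRC_LOADER_BATCH_KEY_MAP is played by a plain dict, and the ID generator is a made-up counter, not how Fusion actually allocates IDs.

```python
# Illustrative sketch of the Key Mapping lookup: reuse an existing
# Oracle Fusion ID (update), or generate and record a new one (create).
import itertools

_next_id = itertools.count(100000)  # stand-in for Fusion's ID generation

def resolve_fusion_id(key_map, object_type, source_guid):
    """Return (fusion_id, action) for a record arriving in the stage
    tables, mirroring the two bullet points above."""
    key = (object_type, source_guid)
    if key in key_map:
        return key_map[key], "UPDATE"
    fusion_id = str(next(_next_id))
    key_map[key] = fusion_id          # recorded in the key map table
    return fusion_id, "CREATE"

# Seeded with the example row from the table above:
key_map = {("Person", "PERS123"): "006854"}
```
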

Let's see this process. I am going to update the Location code from the application.
  












 The record details.












MEHABUB_LOC is the location which already exists in the system. Let's change this to MEHABUB_LOC1.









 Then set the Batch status to 'Ready to Process'











 Rerun the Load Batch Data Process.











 Click on Run.












 Click on Submit.












Note down the Process ID. This is nothing but the ESS job ID.











 Wait till the process is completed.











Now you can see that it has created the MEHABUB_LOC1 location in Fusion.










This picture shows that the location is now available in the application. I am going to use FBL to update the address details of this location.
I just changed the address details, keeping the location code and GUID the same, created a new data file Location29.zip, and called the composite.


 This picture shows the composite call parameter details.










 Composite Flow.












 Submit Load HCM Data with Import and Load Batch Data for the file Mehabub29.zip.










 ESS Job id.












 Wait for the process to complete.











All processes completed successfully.











This shows that one object has been changed.
Let's check from the application.









The marked address is what I changed in my data file.










So this completes the walkthrough of FBL data loading as a conversion tool.

In summary, from this post you now know:
  1. How the FBL Python script creates control files
  2. How Oracle derives the processing order of the LBOs
  3. How data transformation happens inside Fusion
  4. How to load data into the staging tables
  5. How to find the errors and fix them from the application
  6. How to create a data file to update records created with FBL
In the next post I will discuss how to use FBL as an ongoing sync tool using the Business Object Key Map Extract. Till then, keep trying FBL for different business objects. Please go through the MOS documents mentioned in PART I and PART II.

Cheers........................
 

 

Wednesday, 3 July 2013

Fusion Apps File Based Loader (FBL) - Part II

In my previous post I explained how to generate cross-reference files from Fusion. I will use those files to generate my data file. You have to create the data file in a format which can be read by FBL: it must be a '|'-delimited flat file, and it should contain only columns which are supported by FBL.

Now the question is how you will know what columns it looks for. Oracle has published a column-mapping spreadsheet which is available in the Data Conversion Reference Library. If you do not have access to it, then as per the MOS document Oracle Fusion Human Capital Management File-Based Loader V1.1 [ID 1533860.1] ask your COE to send an email to Oracle. The library has all the required resources for HCM data conversion.
One of the important files is the column-mapping spreadsheet, which you can get from here. You need to filter the business object in that spreadsheet. In my case it is Location, so I filtered it and it looks like the next picture.
This spreadsheet explains what columns should be present in the flat file. You can maintain any order of the columns in the flat file, but the first row of your file must be the column names.
Here is the sample data file which I used for my test.



If I open my data file in excel, it looks like this.

You can see that I have provided LOCATION_ID a value of my own, which has nothing to do with the LOCATION_ID stored in the PER_LOCATIONS table in Fusion. The only constraint is that it has to be unique, so you can have your own naming convention for it. Next, as per the column-mapping spreadsheet, the SET_ID value has to be taken from the SET_ID cross-reference file; you will see that file in the zip created as per my first post. Also keep in mind that date fields must use the date format specified in the mapping sheet. If you do not maintain the format, you are going to get errors while loading the data into Fusion. Here, for Location, it has to be YYYY-MM-DD. Once you create the file you have to name it as per the Fusion standard; for Location it is F_LOCATION_VO.dat. Create a folder named Location on your machine, put the F_LOCATION_VO.dat file inside it, then zip the Location folder and name the zip file whatever you want. I named it LocationMehabub28.zip.
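The packaging steps above can be sketched in Python. The header-first '|'-delimited layout, the YYYY-MM-DD dates, and the Location/F_LOCATION_VO.dat structure come from this post; the column subset and values are illustrative, not the full mapping-spreadsheet list.

```python
# Sketch: build a Location data file and zip it the way FBL expects.
import io
import zipfile

def build_location_dat(rows):
    """First row is the column names; data rows are '|' delimited."""
    columns = ["LOCATION_ID", "LOCATION_CODE", "SET_ID", "EFFECTIVE_START_DATE"]
    lines = ["|".join(columns)]
    for row in rows:
        lines.append("|".join(str(row[c]) for c in columns))
    return "\n".join(lines) + "\n"

def build_zip(dat_content):
    """Folder name = business object, file name = F_<OBJECT>_VO.dat."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        zf.writestr("Location/F_LOCATION_VO.dat", dat_content)
    return buf.getvalue()

dat = build_location_dat([{
    "LOCATION_ID": "LOC001",               # your own unique source key
    "LOCATION_CODE": "MEHABUB_LOC",
    "SET_ID": "COMMON",                    # from the SET_ID cross-reference file
    "EFFECTIVE_START_DATE": "2013-01-01",  # YYYY-MM-DD per the mapping sheet
}])
```

The resulting bytes from `build_zip(dat)` are what you would save as, for example, LocationMehabub28.zip.
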
You can load multiple business objects at a time. If you do that, you have to create a separate folder for each business object, and inside each folder you have to put the data file for the related object, like the next picture.

Once you have put in all the data files created as per the column-mapping spreadsheet, take all the folders and zip them. This zip file is the file which will be uploaded to Fusion.


The flat files follow the naming convention F_<BUSINESS OBJECT>_VO.dat.

One of the biggest challenges in FBL is creating these flat files. If you want to convert all objects in one shot, you have to create the relationships yourself. For example, if you want to create departments along with locations, you create the location data file giving LOCATION_ID a unique value, and then pass that value to the LOCATION_ID column of the department file. In this way you create the referential data integrity in the flat files themselves. When you load the zip file into Fusion, it first extracts the individual business objects and then dynamically creates the loading order as per the data model in HCM. There is a table which stores this loading-order sequence of the HCM data model, so whenever you load business objects through FBL, Oracle knows which objects need to be converted first, and accordingly it creates the mapping information between the data files and the Fusion IDs.
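The cross-file key relationship described above can be shown with a tiny example. The column subsets here are illustrative stand-ins, not the real mapping-spreadsheet columns; the point is simply that the same source key appears in both files.

```python
# Sketch: the LOCATION_ID you assign in the location file is reused in the
# department file's LOCATION_ID column to link the two records.

def make_row(columns, values):
    """Render one '|'-delimited data row in the given column order."""
    return "|".join(values[c] for c in columns)

loc_id = "LOC001"  # your own unique value, not the Fusion PER_LOCATIONS id
location_row = make_row(
    ["LOCATION_ID", "LOCATION_CODE"],
    {"LOCATION_ID": loc_id, "LOCATION_CODE": "MEHABUB_LOC"},
)
department_row = make_row(
    ["ORGANIZATION_ID", "NAME", "LOCATION_ID"],
    {"ORGANIZATION_ID": "DEPT001", "NAME": "Finance", "LOCATION_ID": loc_id},
)
```
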
Here is the link from where you can get the sample data files for each business object. The next step is to register the zip file in Fusion. This is done by calling the composite HcmCommonBatchLoaderCoreInboundLoaderComposite. If you are a cloud customer, Oracle will give you the URL for the composite. But if you have an on-premises installation, log on to your Oracle Enterprise Manager Fusion Applications Control and search for the composite under Farm_HCMDomain --> SOA --> soa-infra --> default as shown in the next picture.













Copy the URL and paste it into Mozilla Firefox.










Once you call the composite it will look like this.
Under the payload section, specify any unique values for RunControlID and ProcessInstance.
For ZipFileName, specify the zip file which you created; for me it is LocationMehabub28.zip. For FileLocation: if you are a SaaS/Cloud customer, Oracle will provide you the SFTP path where you need to copy the zip file; generally you put your zip file under the ftp_inbox folder of the SFTP path. But if you are an on-premises customer, your system administrator has to set up the SOA FTP Adapter; the instructions are documented in the Fusion Applications Post-Installation Guide. Once your administrator has set that up, they can give you the host and Unix path where you need to copy the zip file. For me the path has been set up as /home/appsfa/E_1. As I have copied the file under ftp_inbox, I have specified the FileLocation as ftp_inbox.
The next picture shows that I have put my file there.












Now let's see how to tell Fusion which business objects are inside the zip file.
As shown in the picture, you specify the business object names under the LBOList section of the payload. As I am only loading locations, I have specified Location as the LBO. If you have multiple, click on the + sign to add them.



All these web services have security policies associated with them, so you have to specify the user ID and password under the WS-Security section. This is an application user ID and password with the required privileges.
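Pulling the payload fields together, the request can be sketched as follows. The parameter names (RunControlID, ProcessInstance, ZipFileName, FileLocation, LBOList) come from the screenshots above, but the XML element nesting and namespaces here are assumptions, so treat this as illustrative rather than the real WSDL contract. In practice the EM test page (or a SOAP client) builds the envelope for you, with the credentials going into the WS-Security header.

```python
# Sketch: assemble the composite payload fields into an XML fragment.
from xml.etree import ElementTree as ET

def build_payload(run_control_id, process_instance, zip_file, file_location, lbos):
    root = ET.Element("InboundLoaderRequest")  # hypothetical wrapper element
    for tag, value in [
        ("RunControlID", run_control_id),
        ("ProcessInstance", process_instance),
        ("ZipFileName", zip_file),
        ("FileLocation", file_location),
    ]:
        ET.SubElement(root, tag).text = str(value)
    lbo_list = ET.SubElement(root, "LBOList")
    for lbo in lbos:  # one entry per business object in the zip
        ET.SubElement(lbo_list, "LBOName").text = lbo
    return ET.tostring(root, encoding="unicode")

payload = build_payload("RC28", "28", "LocationMehabub28.zip", "ftp_inbox", ["Location"])
```
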

Once you submit, you should get No Response as shown in the next picture.

This means the composite call was successful. Check on the Unix box: the file should have been removed from ftp_inbox.




















Now many of you might be asking where Fusion put this file in the system and what it does with it. To get the details you need some knowledge of SOA; I assume that you have a basic idea about it. So let's get into the business of tracing the flow. This section applies to on-premises customers, because in a SaaS environment you will not have EM access.

Log on to Oracle Enterprise Manager Fusion Middleware Control and search the composite HcmCommonBatchLoaderCoreInboundLoaderComposite which you called.
Under the dashboard you will get all the recent instances of the service.










Click on InboundLoaderProcess.
You will see something like the next picture.
This means that it has actually moved the file to the /u01/ODI_FILE_ROOT_HCM/E_1/X_300000001133519/stage folder. You can cross-verify that from Unix.
It also calls a Python script to unzip the files, create the loading sequence of the business objects, and then create a control file for each LBO.
The following picture, taken from Oracle Enterprise Manager Fusion Applications Control, shows which script it calls and where we can find that script.
We will see what this Python script hr2hrSqlldr.py does in detail in Part III.









The following picture confirms that the composite process has successfully completed.
In the next post I will explain how to import the data from this file to Fusion Tables.
















So you now know how to create the data files, where to put them, and how to register them in Fusion.
There are a few documents about FBL - the FBL Overview, User Guide and step-by-step instructions - in the Data Conversion Reference Library.

Please go through them to have solid understanding.

Cheers.

Saturday, 29 June 2013

Fusion Apps - HCM File Based Loader (FBL) - Part I

File Based Loader is an inbound integration tool. Oracle also has HR2HR, but that is for integration between EBS/PSFT and Fusion. Let's see what these two are.

HR2HR:
HR2HR is designed to allow a source system of record (typically one of the Applications Unlimited products) to co-exist alongside a target Fusion system, allowing data to be iteratively transferred from the source to the target, ensuring that Fusion is synchronized with the system of record. Typically this is used by customers who want to take advantage of new functionality provided in the Fusion Talent, Compensation and Core HR offerings, whilst maintaining their source system as the system of record for reference and employee data.
In HR2HR there are predefined extract mechanisms built for EBS and PSFT. These generate a snapshot of the data that exists in the system of record. This is then moved onto an SFTP server, and Fusion is notified via a web service call that such a file is ready for processing. Fusion then retrieves the file from the SFTP server, unzips it and initiates processing. In the first stage of processing it is passed through an ODI layer where the data is compared against the existing Fusion data, and a series of records is written to the staging tables representing the specific actions that need to be taken on particular data objects in order to synchronize them with the system of record. These changes might be creates, updates, deletes or date-effective changes.

File Based Loader:
File Based Loader is designed to support the same set of objects as HR2HR but where there is not a predefined source system of record. The intention is that this will be used by customers who are only performing an initial data load and who want to bulk load a large volume of data including history. Unlike HR2HR, File Loader simply takes the data from the zip files and loads it directly into the data loading stage tables rather than going through an ODI layer. It then processes the staging data to resolve the primary and foreign key references.
Because File Loader does not have a dependency on ODI it has been possible to extend the number of attributes that are supported within each object, and so for example descriptive flexfield columns are supported in File Loader whereas they are not available in HR2HR.
However, as File Based Loader does not have any pre-defined extracts to populate the data files it is necessary for implementation teams to develop their own extract mechanisms and to ensure that the data is in a form that will satisfy the Fusion business rules.
File Loader supports both data creation and updates. It also supports the full creation of history as well as processing terminated work relationships.


So in short 

  • Use FBL to Load HR Data from your legacy system to Fusion HCM
  • It is a flat file based data upload utility
  • It validates and loads data into Fusion
  • It supports error correction
  • It loads complex data objects with full date effective history
  • You can use multiple data sources to prepare the data file
  • Can be used as Application to Application Integration
  • Can be used as Conversion Tool
  • Can be used for ongoing data sync
  • It is mainly used for Bulk data loading
  • Can be used to update business objects, provided they were created using FBL. At the end of this series I will show that data created in Fusion through the UI cannot be updated using FBL.



How File-Based Loader Works
File-Based Loader uses the Oracle Fusion HCM Load Batch Data process to load your source data to the Oracle Fusion application tables. Load Batch Data is a generic utility for loading data to Oracle Fusion from external sources; for example, it is used by Oracle Fusion Coexistence for HCM, the Oracle Fusion Enterprise Structures Configurator (ESC), and the Oracle Fusion HCM Spreadsheet Data Loader. A key feature of Load Batch Data is that data is imported initially in object-specific batches to stage tables. Data is prepared for load in the stage tables and then loaded from the stage tables to the Oracle Fusion application tables.
For this process to succeed, you need to deliver your data to File-Based Loader in a format that satisfies Oracle Fusion business rules. The Oracle Fusion Human Capital Management File-Based Loader V1.1 User’s Guide explains how to meet this requirement. The MOS note is Oracle Fusion Human Capital Management File-Based Loader V1.1 [ID 1533860.1]

What are other tools available
Oracle has documented these under HCM SaaS Integrations. Here is the slide which talks about that.

What is the latest version of FBL
The latest FBL version is FBL 1.1, available from release 11.1.5.0.0.

End to End Picture

This picture shows the full flow of FBL data loading process. The process starts from Fusion HCM. In order to successfully load data you need to perform the following steps.










1. User Performs Setups in Fusion HCM - Select and configure offerings like Workforce Deployment and Workforce Compensation, depending on your requirements and business process flow. Create implementation projects and complete all the configuration tasks.
2. Generate Mapping File and Send to Source System - Once the configuration steps are completed you need to create the reference mapping file which will be used while creating the actual data files. To do that you need to first specify the configuration parameters. Go to Administration --> Setup and Maintenance.
Click on the All Tasks tab and search for the task Manage HCM Configuration for Coexistence as shown.
Click on Go to Task, which will open a screen like the next picture.








Click on Detach to see the full table. Specify the values of all the parameters; the details are below. Depending on your version you will see some extra parameters. For me it is 11.1.4.0.0 and FBL 1.0.



PARAMETER NAME                      PARAMETER VALUE
Environment Properties File         /u01/APPLTOP/instance/ess/config/environment.properties
HRC Product Top                     /u01/APPLTOP/fusionapps/applications/hcm/hrc
Load HCM Data Files Automatically   N
Loader Cache Clear Limit            99
Loader Chunk Size                   200
Loader Maximum Errors               100
Loader Number of Processes          1
Loader Save Size                    1
ODI Context                         DEVELOPMENT
ODI Log Level                       2
ODI Language                        AMERICAN_AMERICA.WE8ISO8859P1
ODI Password                        (As supplied by Oracle)
ODI Root Directory                  /u01/oracle/ODI_FILE_ROOT_HCM
ODI User                            FUSION_APPS_HCM_ODI_SUPERVISOR_APPID
ODI Work Repository                 FUSIONAPPS_WREP
On Demand FTP Root Directory        /u01/oracle/ODI_FILE_ROOT_HCM/ftp_inbox
Use Python Loader                   Y
User Name Expression                Optionally, you can specify one of:
                                    loaderCtx.getPersonAttr("PersonNumber")
                                    loaderCtx.getAssignmentAttr("AssignmentNumber")
Note: If you are an on-premises customer, replace APPLTOP with the actual path and specify the ODI password. All of these values will be provided by the DBA who installed Fusion Apps.

Your parameters should look like this
Click on Submit as shown in the next picture to save this Configuration in the system.







Once you click Submit it will give you a confirmation; click on OK.






Now that the configuration parameters are set and the offering configurations are complete, we need to generate the mapping file. From the same screen, click on Submit within Generate Mapping File for HCM Business Objects as shown.



It will submit an Enterprise Scheduler job. Note down the job number; we may need it to debug and to find the corresponding zip file name.






Click on Refresh to check the status. If the process status is 12, your file has been successfully created; any other status means you have a problem. To find the exact cause you have to check the ESS log from Navigator --> Tools --> Scheduled Processes. If you don't find a meaningful error detail there, you have to check in the Fusion HCM Domain EM console; I will explain how to check log files from EM in a separate post. Once completed, note the zip file name, log into the Unix box and go to <ODI Root Directory>/E_1.
For Example the process ended in status 12 as shown.








For me the ODI root/E_1 is
/u01/ODI_FILE_ROOT_HCM/E_1. For a SaaS customer this will be done by Oracle, or they can give you access to the location where all these files are copied.
Now use any FTP tool to copy the file. As shown in the picture, it creates a separate folder for each run; the folder name is the same as the numeric part of the zip file name.


I used WinSCP to copy the file from unix server to my local machine.








You can now see the file in your local machine.








Unzip the file and you can see the list of cross-reference files generated by Fusion.







In my example I will show how to upload a Location into Fusion. All the locations which I upload will be associated with the Common Set ID so that they are available for all legal entities.

You can upload all these cross-reference files and store them in mapping tables using SQL*Loader. Then, while writing your extraction script, you can dynamically get the information from there and create the data files.

So till now I have explained

  • What is HR2HR?
  • What is FBL and how does it work?
  • How to configure Fusion to start using FBL?
  • How to generate Cross Reference Files?
  • How to get that from Unix and save it to Local machine?
In the following two posts I have explained how to create the data file for the Location business object and how to register the file in Fusion for processing:
  1. Fusion Apps - File Based Loader Part -II
  2. Fusion Apps - File Based Loader - Part III

----------Updated on 7th July 2013  with link of the remaining two series...
Cheers.....................................



Tuesday, 25 June 2013

Fusion Apps SaaS Service Scenarios

Hosting Service scenarios for Oracle Cloud Fusion Applications services

Oracle has documented different service scenarios which customers may ask for if they want to use SaaS. While creating service requests for cloud apps you will be presented with a list of services that Oracle supports.

Following are the services that are provided:
Email Notifications - Enable or disable email notifications

IP Address Whitelist - Allow access to Oracle Fusion Cloud Services from a list or range of specific client IP addresses on the corporate network.

Language Pack installation - Ask for additional language pack.

Personalize Logo - Personalize Fusion Applications with your company logo.

Production to Test (P2T) copy - Copy the Production environment database into a Test environment on a one-off basis.

Enabling Flexfields for OTBI analysis - You can configure DFFs and create analyses/dashboards based on them. To enable a DFF for OTBI analysis you can create a service request.

FA Admin Password reset - Request to reset the password provisioned with the IT Security Manager job role.

Configure SSO - Configure Single Sign-On (SSO) access to Oracle Fusion Cloud Services based on Identity Federation.

Environment Resizing - By default, test environments are sized to support enough concurrent users for basic use of the environment. To ensure that your use of the test environment will not be limited on a go-forward basis, you should request the pod resizing service option for your test environment(s) early in your implementation. Production environments are sized to support the number of users you have purchased.

All of these are well documented in MOS note Hosting Service scenarios for Oracle Cloud Fusion Applications services [ID 1529174.1]