JMJ CLOUD

Oracle ERP Conversions Made Easy with APEX (Part 3)

6/10/2020


 
In our prior post, Oracle ERP Conversions Made Easy with APEX Part 2, we reviewed the tools available to us to load data into ERP Cloud and discussed some of the challenges of using them.
 
This final part of our blog series shows you how APEX will make your conversions simpler, faster and repeatable. 

APEX to the Rescue


By taking advantage of APEX’s built-in features for modern cloud-based applications, your conversion can be simplified and automated.   With APEX, you can:
  • Build a secure, multi-tenant, cloud application, delegating authentication to other entities such as Office365 or Gmail.
  • Build forms allowing users to upload files securely to an APEX application and review content using an Interactive Report or Interactive Grid.
  • Parse input files using APEX_DATA_PARSER in a variety of formats, for example CSV, Excel, XML and JSON.
  • Call ERP Cloud REST or SOAP web services using the routines defined in the APEX_WEB_SERVICE package.   For users of Autonomous Database, no additional wallet or Access Control List configuration is needed.   If you’re on Exadata Express Cloud Service, you’re also OK, though perhaps it is time to upgrade!
  • Enter Web Service Credentials and store them securely in APEX.   Credentials can then be maintained or set at the session level using the APEX_CREDENTIAL package.
  • Use APEX_JSON to work with JSON files used by many REST web services.
  • Zip FBDi files for import using the built-in package APEX_ZIP.
  • Use the rich functionality of SQL and PL/SQL to clean, map and transform data before it is imported.
  • Use APEX_JWT to work with JSON Web Tokens (JWT) issued by ERP Cloud.

If you don’t have access to APEX, this functionality is available free of charge: Oracle provides two Always Free instances of Autonomous Database via this link: https://www.oracle.com/cloud/free.

If you haven’t done so already, sign up today as it isn’t often you get something for nothing!
Applying the APEX Toolset to ERP Cloud Conversion
An APEX application used to convert data to ERP Cloud can:
  • Stage and parse simplified source data from Excel XLSX spreadsheets.    APEX will accept other formats too (e.g. XML, JSON, CSV, PSV) but XLSX is more user friendly.  APEX will read the underlying numbers and dates in their raw form, regardless of formatting.
  • Reduce end-user data preparation by cutting the number of fields users must populate compared with a raw FBDi upload.
  • Automate data cleansing (for example, changing case, removing trailing spaces, applying or reading date formats).
  • Share data, files and results across users in a multitenant application.
  • Maintain mapping tables for data transformation per target environment.
  • Synchronize master data from ERP Cloud to validate data and mappings, for example lists of Legal Entities, Business Units or Chart of Accounts values.
  • Retrieve conversion data from ERP Cloud to validate that all data imported and produce reconciliation reports.
  • Create pre-zipped upload format files with a single button click, e.g. File Based Data Import, HCM Data Loader.
  • Automate data uploads using FBDi, HDL or Business Object Web Services.
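As a sketch of the cleansing step mentioned above, the SQL involved is simple. The staging table and column names below are illustrative, not taken from a real project:

```sql
-- Hypothetical staging table: trim and re-case names, and
-- standardize dates that arrived as text in a known format.
update xx_worker_stage
   set first_name    = upper(trim(first_name)),
       last_name     = upper(trim(last_name)),
       hire_date_std = to_char(
                         to_date(trim(hire_date_txt), 'DD/MM/YYYY'),
                         'YYYY/MM/DD')
 where batch_id = :P10_BATCH_ID;
```

Because the data sits in the database, the same cleansing rules run identically on every file and every conversion cycle.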
 
Sounds good?   Let’s move on to look at some real-world examples.
Real-World Example 1 – Employee Conversion
A client recently needed help converting workers for an ERP Cloud project.     The number of fields to be populated was small, as Oracle HCM was being used only to support Expenses and Self-Service Procurement.  External systems were in place for HR and Payroll.
​
Workers would be provided to us in a simple spreadsheet format, similar to that below.

The Company field would act as a key used to derive additional fields such as Legal Entity and Business Unit.

The objective was to enrich multiple spreadsheets received in this format from each of the client businesses with the additional fields needed by Oracle, and then to map the data into the Employee/Supervisor HDL file formats.

​First, we built an APEX Classic Report Cards page to allow users to upload an XLSX spreadsheet into a database table.

This makes use of the APEX_DATA_PARSER package to parse the Excel file straight into a database table.

​A mapping page was built using an APEX Interactive Grid, allowing us to enter and maintain the mapping from Company to other fields required by HCM Data Loader:

​A key field for ERP Cloud Finance projects is the Default Expense Account, used by Expenses and Procurement.    As the Chart of Accounts structure could vary depending on which Ledger the Business Unit is linked to, the mapping simply populates any segments provided in the source file, then fills in any missing segments based on the Company.   In this case, the Company, Division, and Department are in the source file, and the Account is defaulted in from the Company mapping.

The mapping is applied to the imported file, and the enriched data can be reviewed.   At this stage, data has also been cleansed – for example worker names are trimmed of leading or trailing spaces and changed to uppercase.

​When the data is ready to be loaded to ERP Cloud, the Build buttons are used to generate zipped HDL files in the correct format.   APEX_ZIP is used to zip up the files.   Only selected records are included; this allows us to re-send a specific worker if an error is encountered during the import process.  
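The zip step can be sketched with APEX_ZIP along these lines; the content-building function and the table used to hold the finished file are placeholders for whatever your application uses:

```sql
declare
  l_zip    blob;
  l_worker blob;
begin
  -- xx_build_worker_dat is a hypothetical function that renders the
  -- selected records as HDL content and returns it as a BLOB
  l_worker := xx_build_worker_dat(p_batch_id => :P20_BATCH_ID);

  -- add the HDL file to the zip archive
  apex_zip.add_file(
    p_zipped_blob => l_zip,
    p_file_name   => 'Worker.dat',
    p_content     => l_worker );

  -- write the central directory and close the archive
  apex_zip.finish( p_zipped_blob => l_zip );

  -- store the finished zip for the Download page (illustrative table)
  insert into xx_hdl_files (file_name, file_blob)
  values ('Worker.zip', l_zip);
end;
```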
​The prepared HDL files are then accessed on the Download page.
​If we open up the zip file, we can see that the hard work of building the HDL file in the correct format has all been done.
​In this Use Case, the upload was performed manually from ERP Cloud using the Data Exchange screen below.   Note, this step could also have been automated in APEX.
Having uploaded the workers, we now need to check that they all imported successfully in order to load the supervisors.    That’s easy from the Data Exchange screen if all workers load, but in a larger file, getting a quick list of kickouts can be challenging.
​
We’ll use another APEX page for this.    Clicking on the New Worker Extract button initiates a worker download from your ERP Cloud Environment.
​A background job uses the APEX built in packages APEX_WEB_SERVICE and APEX_CREDENTIAL to connect securely to BI Publisher running on ERP Cloud and run a report on workers.   Data is retrieved in XML format and parsed into our database using APEX_DATA_PARSER.
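In outline, that call pairs a stored Web Credential with a SOAP request to BI Publisher's report service. The report path, credential static ID and pod URL below are placeholders:

```sql
declare
  l_envelope clob;
  l_response clob;
begin
  -- Minimal runReport envelope; /Custom/XX_WORKERS.xdo is illustrative
  l_envelope :=
    '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
                    xmlns:pub="http://xmlns.oracle.com/oxp/service/PublicReportService">
       <soap:Body>
         <pub:runReport>
           <pub:reportRequest>
             <pub:reportAbsolutePath>/Custom/XX_WORKERS.xdo</pub:reportAbsolutePath>
             <pub:attributeFormat>xml</pub:attributeFormat>
           </pub:reportRequest>
         </pub:runReport>
       </soap:Body>
     </soap:Envelope>';

  apex_web_service.g_request_headers(1).name  := 'Content-Type';
  apex_web_service.g_request_headers(1).value := 'application/soap+xml';

  -- ERP_CRED is a Web Credential defined in Shared Components
  l_response := apex_web_service.make_rest_request(
    p_url                  => 'https://your-pod.oraclecloud.com/xmlpserver/services/ExternalReportWSSService',
    p_http_method          => 'POST',
    p_body                 => l_envelope,
    p_credential_static_id => 'ERP_CRED' );

  -- l_response carries the report output Base64-encoded; decode it
  -- and hand the XML to APEX_DATA_PARSER for staging
end;
```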
​Once loaded, the worker data can be reviewed and verified.
Back on the original Worker Upload page, the Build Supervisor HDL File button can build a second file adding each worker’s Line Manager.   The form checks first that the Line Manager exists in Oracle, based on the data that was downloaded using BI Publisher in the prior step.
 
APEX helped us reliably validate, map, and load 1000 workers from five different sources in under an hour on cutover weekend – and this process can be repeated on subsequent projects.
Real-World Example 2 – PO Change Orders
Another customer needed assistance doing a mass-update of 7,000 PO Line Schedules from 3-Way Match to 2-Way Match.   

​They would provide a list of the Purchase Orders, Lines and Schedules, and wanted us to create a process to create Change Orders using a Procurement Cloud SOAP web service.
​
We verified the changePurchaseOrder web service was operational and met customer requirements before documenting the inputs it needed for a successful update of single and multi-line orders.

    
​The input file would be a flat XLSX format, which we extended to allow update of up to three fields at the Line or Schedule levels.   As it turned out, the only attribute that the client wanted changing was MatchApprovalLevelCode, but this future-proofed us for any additional requirements or use on other projects.
​As before, we created an upload form in APEX to accept XLSX files using APEX_DATA_PARSER.

​Imported rows are reviewed, and when the Submit Change Process button is clicked, a background job is run to do the updates.  

The ERP Cloud Credentials used by the web services are held securely using APEX_CREDENTIAL.  APEX_WEB_SERVICE is used to call SOAP web services for each PO update, checking the PO status before and after the changes using further web service calls.

​Any issues with updates (for example because a PO already has a change order in process) are reported back to APEX, along with the PO status before and after the change.

The standard APEX Interactive Grid download functionality is used to download the file’s rows and the update messages, providing a status report and audit trail.
 
Following a dry-run in a TEST environment first, the mass-update of 7,000 PO schedules worked successfully in PROD in a couple of hours.
Additional Use Cases
Over the last 5 years of working with ERP Cloud, we have encountered many reproducible use-cases that would benefit from the tools available to us in APEX.
​
We are building these into the TurboLoad module of JMJ ERP Cloud Express.

Further examples include:
​
  • ESS Job Hold and Release.   Oracle asks you to put ESS and BI jobs on hold before quarterly updates; if you have hundreds of jobs running this can be time-consuming without automation.   Upload a simple spreadsheet with a list of the ESS Job Ids to quickly apply the holds and release them once the environment is released back to you by Oracle.
  • Supplier Inactivation.   Do you have hundreds of supplier sites that are no longer used or required?   A mass-update can be used to quickly apply an inactive date to these records to clean up your supplier master.
  • GL Conversion.   The FBDi format is recommended for high-volume uploads but doesn’t pre-validate GL Code Combinations or check that other attributes of your conversion journals are correct before Import Journals is run.   This APEX page allows you to convert your GL data from a simple Excel file, checking account segment values, cross-validation rules and journal totals before mapping into FBDi format and automating Import Journals.
  • Password Resets.   We don’t recommend mass password resets using a single password post-clone as this represents a security risk, however it can be necessary to mass-update multiple user passwords from time to time for UAT or Training purposes.   This can be done from a spreadsheet using Oracle’s SCIM (System for Cross-domain Identity Management) web services.
  • Project Status Changes.   If you have many open projects in PPM and want to change their statuses, e.g. from Contracted to Closed, this tool will help you.   Upload a spreadsheet with the Project Numbers and their desired status, and APEX will quickly apply the updates.
Conclusion
A SaaS-based environment with all the features of APEX and the Oracle Database is the ideal tool for Oracle ERP Cloud data conversions and integrations.
 
Streamline your ERP Cloud data loads using APEX built-in web service calls, Excel file parsing and low code UI development allied with the power of SQL & PL/SQL.    
 
Your conversion process can be faster, repeatable, and generate a full audit trail and upload reporting.    Isn’t that what we all want when loading data to Production?

Oracle ERP Conversions Made Easy with APEX (Part 2)

5/13/2020


 
In our prior post, Oracle ERP Conversions Made Easy with APEX Part 1, we showed you how Oracle APEX can be used to streamline and improve data conversion to Oracle EBS.   
 
This blog entry continues the theme, showing you how the same principles and techniques can be applied in the Cloud world when converting data to ERP Cloud.
Part 2 walks through the tools available in ERP Cloud and highlights some of the challenges in using them.
 
Part 3 demonstrates how APEX is the perfect fit to make your conversions simpler, faster, and repeatable. 
 
We’ll share some real-world case studies to show you what can be achieved, including a preview of the pre-built conversions built into our latest APEX cloud software -  JMJ ERP Cloud Express.

​ERP Cloud Conversion Tools

​Let’s start by reviewing the tools available to us in ERP Cloud for getting data in and out of the database.   There’s a long list (and a lot of acronyms to work with).  
 
Some are spreadsheet-driven and geared towards the end-user, others are more technical in nature and require specific software to drive them.   

ADF Desktop Integration - ADFDi

ADF Desktop Integration (ADFDi) spreadsheets are similar to the GLDi spreadsheets used with Oracle EBS – preconfigured and heavily coded spreadsheets that you populate with your data.   Common uses are for end-users to load GL journals or AP invoices. 
ADFDi for importing GL Journals
ADFDi for importing AP Invoices
​These have some advantages for loading data: they pre-validate data before loading, and typically have built in dropdowns allowing users to review and select valid values for fields such as GL Accounts or Supplier Names.    
 
However,  they struggle with larger data volumes, don’t work on MacOS, can be fiddly to use (for example the undo button doesn’t work and you often have to start again if you make a mistake), and Excel template downloads have to be periodically refreshed when Oracle carries out quarterly upgrades.    
 
They also require a local install of the ADFDi framework into Excel - that can be tricky to deliver in some corporate environments and again requires re-installation with each quarterly release.   Finally, they’re only available for a limited list of business objects.   You won’t be loading customers into ERP Cloud using ADFDi.

File-Based Data Import - FBDi

File-Based Data Import (FBDi) spreadsheets are downloaded from Oracle’s online cloud documentation pages and consist of simple worksheets with one or more tabs, for example Assets with Asset Distributions, or Customers with Customer Contacts.    
Data is entered into the spreadsheet tabs manually but there is no live validation; comments in each column header give guidance on how to fill each field but as there are many fields in each sheet, this can be challenging for an end user to work with.    Most fields are optional; others are conditionally required which can cause confusion.
 
Once data has been entered, an Excel macro is kicked off using a button on the first tab to build a zip file.   This contains the data in CSV format.   From a technical perspective, you can create the CSV file directly and zip it up yourself, but there are a number of things to look out for - the order of fields in the spreadsheet doesn’t always match the CSV generated by the macro, and the CSV filenames must remain consistent with the Oracle-generated names for some of the loads.
 
Having got this far, and assuming you have exercised due diligence to make sure values are valid (for example by using VLOOKUP), the upload process consists of running Enterprise Scheduler Service (ESS) jobs in Oracle to transfer and load the file.   The number of jobs and the submission process may vary depending on the business objects, but a typical flow is to:

  • Run Load Interface File for Import to stage your file into an Open Interface table, e.g. GL_INTERFACE.    Prior to job submission, your source file is staged onto Oracle’s Universal Content Management server (UCM) and the job kicks off subprocesses to transfer this to the Oracle database and run SQL*Loader.
  • Optionally, run a second Import process to validate and import the data into the final destination tables, e.g. 
    • Journal Import (GL Journals)
    • Import Costs (PPM Expenditure Items)
    • Import Invoices (AP Invoices)
 
The second process usually provides an output report listing data loaded along with any errors.    This should always be reviewed, and issues may require resolution.   This can mean going back to the beginning and checking your source spreadsheet, making corrections and repeating the upload process.  
 
End to end, this process can be tricky and time consuming to get right.   For the best results, a user has to be rigorous in Excel to ensure the data is correct before loading.
 
The list of FBDi templates is found here:
 
https://docs.oracle.com/en/cloud/saas/financials/20b/oefbf/index.html
​

FBDi Automation Web Services

FBDi lends itself to easy automation via Oracle's SOAP and REST web services.   Clearly this is not something you’d expect an end-user to do.    If you have Oracle APEX or some other form of middleware, you can use the ERPIntegrationService web service to:
  • Load files to the Universal Content Management Server (UCM)
  • Submit ESS Jobs (e.g. Load Interface File for Import and Import Journals)
  • Check ESS Job Statuses
  • Download and verify logs and output reports
 
It can be cumbersome to write a full integration flow, calling each web service in turn and checking statuses, and Oracle helps us here by providing a bulk import web service; a single web service call transfers the file and runs all the required import processes in the correct order.   The user can opt to receive an email notification when the job is complete.
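In outline, the bulk import is a single SOAP request to ERPIntegrationService. The payload below is a rough sketch: element names should be checked against the SOAP documentation, and the job name and parameter list shown are illustrative:

```xml
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
               xmlns:typ="http://xmlns.oracle.com/apps/financials/commonModules/shared/model/erpIntegrationService/types/">
  <soap:Body>
    <typ:importBulkData>
      <typ:document>
        <!-- the FBDi zip file, Base64-encoded -->
        <typ:Content>BASE64_ENCODED_ZIP</typ:Content>
        <typ:FileName>GlInterface.zip</typ:FileName>
        <typ:ContentType>zip</typ:ContentType>
      </typ:document>
      <typ:jobDetails>
        <!-- ESS job to run once the file is staged; illustrative values -->
        <typ:JobName>oracle/apps/ess/financials/generalLedger/programs/common,JournalImportLauncher</typ:JobName>
        <typ:ParameterList>#NULL,#NULL,N,N,N</typ:ParameterList>
      </typ:jobDetails>
      <typ:notificationCode>10</typ:notificationCode>
    </typ:importBulkData>
  </soap:Body>
</soap:Envelope>
```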
 
The ERPIntegrationService web services are documented here:
  • SOAP 
  • REST 

Business Object SOAP Web Services

Oracle provides a series of SOAP web services linked to business objects such as Suppliers and Purchase Orders.   These are documented online for each release, so for 20B that is located here. 
 
The documentation can be cryptic and working out which fields are required/not required or are updateable can take a bit of time and experimentation.    We suggest using the “retrieve” or “get” method to inspect the fields on an existing transaction - this is often a good guide as to what is required when creating a new transaction.  
Sample PO Change Order Payload
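A change-order request has roughly this shape. The namespaces and element names are an illustrative reading of the changePurchaseOrder documentation, not copied from the project, and should be verified against the release you are on; the 2-Way Match value ties back to the use case above:

```xml
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
               xmlns:typ="http://xmlns.oracle.com/apps/prc/po/editDocument/purchaseOrderService/types/"
               xmlns:pur="http://xmlns.oracle.com/apps/prc/po/editDocument/purchaseOrderService/">
  <soap:Body>
    <typ:changePurchaseOrder>
      <typ:changeOrderEntry>
        <pur:OrderNumber>PO100042</pur:OrderNumber>
        <pur:PurchaseOrderEntryLine>
          <pur:LineNumber>1</pur:LineNumber>
          <pur:PurchaseOrderEntrySchedule>
            <pur:ScheduleNumber>1</pur:ScheduleNumber>
            <!-- the attribute being mass-updated in this example -->
            <pur:MatchApprovalLevelCode>2-way</pur:MatchApprovalLevelCode>
          </pur:PurchaseOrderEntrySchedule>
        </pur:PurchaseOrderEntryLine>
      </typ:changeOrderEntry>
    </typ:changePurchaseOrder>
  </soap:Body>
</soap:Envelope>
```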

    
​Bear in mind that most of the SOAP web services are older and slowly being replaced by REST.   Oracle has been known to retire or de-support SOAP web services.  The Supplier and Person web services have both been deprecated or replaced by REST or HCM Data Loader (HDL).
 
Some of these web services will only accept one transaction at a time; others allow multiple transactions to be sent in a single call, with errors returned for each record.

Business Object REST Web Services

​The move to REST web services is ongoing, and with each recent quarterly release, new REST web services have been made available, or existing web services have been extended.    This is good news, and if a REST service is available as well as a SOAP web service, REST is the option to choose.    
 
It is wise to check that the REST web service is appropriate for your use case before you start; many still only allow a GET operation, with PUT, POST or PATCH hopefully available in future releases.    
 
We recommend verifying that a REST service includes all the fields you’d like to populate or update; the HCM employee service did not include the Default Expense Account field when it was first introduced, which is not good for Financials implementations.    That omission has now been rectified but the web service is still marked as “Controlled Release", meaning that an SR must be raised to enable it, and your use case must be approved by Oracle.
Sample REST Call to add a team member to a Project
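From PL/SQL, such a call might look like the sketch below. The resource path, version, payload fields and credential name are assumptions for illustration; check the REST documentation for your release:

```sql
declare
  l_body     clob;
  l_response clob;
begin
  -- illustrative payload: person and role to add to the project
  l_body := '{ "PersonId" : 300000012345678,'
         || '  "ProjectRoleName" : "Team Member" }';

  apex_web_service.g_request_headers(1).name  := 'Content-Type';
  apex_web_service.g_request_headers(1).value := 'application/json';

  -- ERP_CRED is a Web Credential defined in Shared Components
  l_response := apex_web_service.make_rest_request(
    p_url => 'https://your-pod.oraclecloud.com/fscmRestApi/resources/11.13.18.05/'
          || 'projects/300000011111111/child/ProjectTeamMembers',
    p_http_method          => 'POST',
    p_body                 => l_body,
    p_credential_static_id => 'ERP_CRED' );

  -- l_response contains the created team-member resource as JSON
end;
```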

    
Business Object REST web services are documented here.

HCM Data Loader (HDL)

HCM Cloud provides a tool called HCM Data Loader (HDL), accessed from the Data Exchange work area.   This covers a variety of HCM business objects, and is used to load and maintain users, user roles, workers and key setup elements such as Jobs and Locations.
 
HDL expects text files in pipe-separated format (PSV).  The sample below for loading jobs shows the structure, with METADATA lines describing the data fields being loaded, followed by one or more data rows.
Sample Job Load using HDL
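A minimal Job.dat follows the METADATA-then-data pattern described above. The field list here is abbreviated and the values are invented; the HDL business-object documentation gives the full attribute set:

```text
METADATA|Job|JobCode|SetCode|EffectiveStartDate|Name|ActiveStatus
MERGE|Job|XX_DEV|COMMON|2020/01/01|Developer|A
MERGE|Job|XX_QA|COMMON|2020/01/01|QA Analyst|A
```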

    
For multiple-level entities such as Workers, the situation is more complex: several METADATA rows are combined in the same file, so that, for example, Names, Emails and Assignments can be included along with the Person Number.
Sample Employee load using HDL
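An abbreviated Worker.dat, showing how child records reference the parent via source keys. Field lists and values are illustrative only:

```text
METADATA|Worker|SourceSystemOwner|SourceSystemId|EffectiveStartDate|StartDate|ActionCode
MERGE|Worker|LEGACY|W_1001|2020/01/01|2020/01/01|HIRE
METADATA|PersonName|SourceSystemOwner|SourceSystemId|PersonId(SourceSystemId)|EffectiveStartDate|LegislationCode|NameType|FirstName|LastName
MERGE|PersonName|LEGACY|W_1001_NAME|W_1001|2020/01/01|US|GLOBAL|Jane|Smith
METADATA|WorkerEmail|SourceSystemOwner|SourceSystemId|PersonId(SourceSystemId)|DateFrom|EmailType|EmailAddress
MERGE|WorkerEmail|LEGACY|W_1001_EMAIL|W_1001|2020/01/01|W1|jane.smith@example.com
```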

    
Date-tracking in the HCM modules is a key area to watch and to get coded correctly; a change in Start Date is possible but can create problems if it is only noticed several years later and a worker has a complex employment history (e.g. multiple hires and re-hires across multiple legal entities with OTL timesheets).
 
It is important to pay attention to SourceKeys – they’re crucial for updates to records after they have first been loaded. 
 
Tracking/tracing issues can be problematic without writing custom reports; the Data Exchange allows you to drill through each file to see error rows but isn’t great at summarizing error messages across a large file.   The error messages can be cryptic and require some experience to read.

BI Publisher

BI Publisher reporting allows us to run custom SQL statements against the ERP Cloud Database.   We typically need to verify setup values, report on import errors and confirm data imported successfully during our integrations and conversions, and so the easiest way is to call BI Publisher using web services.   These are documented here.
 
This works well for smaller files.  For larger data sets, you need to be aware that a connection or download may time out after a couple of minutes.   You have several approaches available to you to deal with this.  
  • One option is to run the SQL in batches, returning data using multiple sequential web service calls and aggregating it on arrival.
  • Alternatively, you can specify that a report’s  output be delivered to Oracle’s Universal Content Management server (UCM).   Once the report is complete, UCM web services can be invoked to download the zip file.   
 
The UCM web services are documented in this My Oracle Support note:
  • How To Get Files from UCM for Fusion Applications using the GenericSoapPort Web Service (Doc ID 2384418.1)

BI Cloud Connector (BICC)

​BI Cloud Connector allows you to run a Cloud Data Extract to UCM or to Oracle Cloud Storage.  
 
These are extracts that can be scheduled as required to export data for various Business Objects.   The resulting files can be accessed using UCM web services or links to Oracle Cloud Storage.
​You can be notified once an extract is complete.   The zip files on UCM contain the data in CSV format along with a Manifest and JSON file.    The Manifest file shows the content files and view version:
Manifest

    
The JSON file contains the query performance data.
JSON

    
Documentation for 19A is found here.

Cloud Conversion Challenges

Using the ERP Cloud tools to execute data conversions and integrations without some form of automation can lead to problems.  
  • Manual Excel manipulation is slow and subject to errors.   Users can make mistakes, add trailing spaces, save files with inconsistent date formats and leave mandatory fields blank.   Formulas can be out of sync and Excel lookups can fail.   Simply opening and closing a CSV file can corrupt the formats of dates and numbers within the file.
  • Excel files can be hard to secure and protect.   Excel passwords can be cracked online, and files are often stored un-encrypted on local computers.    Data security is a wider topic than this blog – however,  always remember to treat sensitive data (e.g. PII personal information) with due respect and caution.
  • Excel files saved locally can be hard to work on collaboratively.   Sharepoint/Box/DropBox and other cloud tools provide an alternative but are often left unused and spreadsheets fly back and forth via email.
  • How do I quickly and reliably map and default extra fields into my data?   I can use VLOOKUP (or INDEX/MATCH) in Excel – but how do I know the values are right and that all data has looked up correctly?
  • How do I know if the data is correct and complete before I load it?   How do I check it is still correct when I send it to a different environment that may have different setup values?
  • How do I check data has been fully imported once it has been sent to Oracle?   How can I provide a reconciliation report back to my users comparing what they sent for load versus what made it into ERP Cloud?
  • I have no middleware available - how can I automate my web service calls?
  • Why does it take so long to run the data into Oracle?  Why are there so many processes to run?  How can I reduce the time taken for each conversion cycle and reduce the potential for user error?

To Be Continued ...

​In Part 2, we have reviewed the tools available to load data into ERP Cloud, and discussed some of the challenges.   
 
In Part 3, we will show how APEX can vastly simplify the data migration process, and we’ll share some real-world case studies to show you what can be achieved.
 
This will include a preview of the pre-built conversions and tools built into our latest APEX cloud software -  JMJ ERP Cloud Express.

Author

Matt Paine, Co-Founder, JMJ Cloud

Oracle ERP Conversions Made Easy with APEX (Part 1)

5/7/2020


 
Oracle ERP Conversions Made Easy
​This is the first in a two-part post on how to use APEX to facilitate data conversions (and data uploads) into Oracle e-Business Suite (EBS) and Oracle ERP Cloud.

ERP systems are data hungry beasts that require constant feeding. The task of converting and loading data using existing tools is both time consuming and error prone.

In these posts we will cover the benefits of using APEX for data upload and describe several examples where APEX has been used to make the task of bulk loading data into EBS and ERP Cloud more user friendly and more accurate than traditional methods.

In this first post, we will focus on Oracle e-Business Suite (EBS).

Background

You may think that conversions are only required during the initial implementation of an ERP. For organizations carrying out acquisitions, however, data conversions may be conducted several times a year. There are also many scenarios where companies need to regularly upload data from other systems into their ERP. The standard tools for uploading data into EBS are not user-friendly and require a significant investment in time from IT (or consultants). If the upload and validation of data could be put directly in the hands of end users, it would reduce costs and increase data accuracy.

Legacy Tools

In the olden days, the business and IT had to perform the following complicated dance in order to load data into EBS:
  1. IT receives a copy of the data file from the end user.
  2. IT copies the file to the application file server.
  3. IT runs a SQL*Loader script to load the table to a custom staging table.
  4. IT runs a concurrent program to validate the staged data and sends a report to the end user to review.
  5. The end user corrects errors in the data and sends a new file to IT to repeat steps 1-4.
  6. Once all data has been validated, IT will run another concurrent program to move the data to the Open Interface table and launch the Open Interface concurrent program.
  7. IT sends the report output from the Open Interface to the end user.
  8. End user reviews results and potentially sends a revised file with fixes to any failed records.

Requirements for a Better Solution

Any solution we provide using APEX must meet the following criteria:
  • Must be user friendly so that occasional business users can easily perform uploads.
  • Must clearly show end users what data is valid and what is invalid.
  • Must utilize standard EBS mechanisms to perform the final data load (e.g. EBS Open Interface Tables, EBS Public APIs etc.).
  • Must prevent duplicate and/or bad data from being loaded into the system.

A Better Solution Using APEX

There are two supported approaches to loading data into EBS: using the ‘Open Interface Tables’ or using ‘Public EBS PL/SQL APIs’. Using these methods ensures that you do not violate your support agreement and that EBS has performed final validations on your data before loading it into its base tables. The diagram below shows the pattern we use for conversions/uploads that use EBS Open Interface tables to perform the final upload of data into EBS.
ERP Conversions with APEX Design Pattern
As you can see from the design pattern above, using APEX, we have reduced the number of steps from 8 to 4 and put the entire process in the hands of the end user.

Let’s drill into these steps in a little more detail.
 
1. Data Parsing and Initial Validation
The File item type in APEX allows users to drag and drop (or select a file) from their file system onto an APEX page. You can then process that file as a BLOB within PL/SQL. Once you have the BLOB, you can hand over the heavy lifting of parsing the file to APEX_DATA_PARSER. Available since APEX 19.1, APEX_DATA_PARSER parses the BLOB and returns the results so you can consume them in a SELECT statement. We blogged in more detail about APEX_DATA_PARSER here.

​Example SQL Parsing a BLOB:
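A typical parse looks like this; the page item, file name and column aliases are illustrative:

```sql
-- APEX_DATA_PARSER exposes parsed rows as LINE_NUMBER plus
-- generic columns COL001..COL300, aliased here for readability
select line_number,
       col001 as employee_number,
       col002 as first_name,
       col003 as last_name
  from table(
         apex_data_parser.parse(
           p_content   => :P10_FILE_BLOB,
           p_file_name => 'workers.xlsx' ) );
```

The p_file_name extension tells the parser which format to expect (XLSX here), so the same call handles CSV, XML or JSON files unchanged.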

    
Now that the data is parsed you can loop through it performing your own pre-validations. It is a good idea to perform pre-validations on user-provided data so that you can intercept 95% of errors before records make it to the EBS Open Interface tables. It is much harder to clear records and re-try failed records once they are in the Open Interface tables. As the records are validated, we add them to an APEX Collection. Collections are possibly the most useful tool in your APEX toolbox.  They provide a temporary storage area for data that is contained within your specific APEX session.
 
2. Review Validation Results
Now that we have the valid and invalid records in a collection, we can present these to the user in an Interactive Report (or Grid). The user can review invalid records, correct data in their source file, re-upload and re-iterate until all records pass validation. The ability for the end user to iterate is a key differentiator when using this method over traditional approaches.
 
3. Initiate EBS Open Interface
Once the user is happy with the results of validation, they click a button to insert the valid records into the appropriate Open Interface tables and then the Open Interface Concurrent program is launched using a PL/SQL API.
 
4. Review Final Results
Once the concurrent program is complete, the user can review any failed records in the interface table and see errors from the open interface table’s associated error table. For example, the item interface table mtl_system_items_interface populates errors in the table mtl_interface_errors. This is achieved without the user having to leave APEX by providing a 2nd APEX page to review the content of these tables.

Video Clip

The following video demonstrates a user uploading a file, submitting it for validation, reviewing the results of validation and then submitting the valid records to the interface table. Finally, it shows the interface completing without errors.

Considerations

Here are some considerations when using APEX to load data:
  • APEX Collections
    • APEX Collections offer a great deal of flexibility. Because collections are widely used by APEX developers you need to ensure you don’t adversely impact performance of other code that uses collections. You can do this by making sure you delete the collection once you are done with it.  This is especially important when loading large volumes of data.
  • Unique Batch Identifier
    • Once you move validated records into the EBS Open Interface table, you need a way to uniquely identify your specific batch of records. Most EBS interfaces have a batch identifier column that you can use for this. For example, the item interface table mtl_system_items_interface has a column batch_id that you can use for this purpose.
  • Viewing Results of the EBS Open Interface in APEX
    • Once the Open Interface has run you should provide another APEX page to allow users to see records in the interface table and its respective error table, e.g., the item interface table mtl_system_items_interface and its error table mtl_interface_errors. The unique batch identifier described previously helps you query records for a specific interface run.
  • Using EBS Public APIs
    • Many EBS entities provide a public PL/SQL API as well as an Open Interface. Using a public API gives you immediate feedback (success or error) on whether a record was successfully created in EBS. This has obvious benefits, but you should use caution when loading large numbers of records. When calling the public API for each record, your user has to wait until all records are created before they get control back, and there is also a risk that the web server will time out. If instead you submit a concurrent program to load via the Open Interface, your user can go about their work while the load occurs in the background. I typically use public APIs for loads of up to 10,000 records and the Open Interface for larger loads.
  • Reduced Custom Objects
    • Using APEX, APEX_DATA_PARSER and APEX Collections reduces the number of custom objects you need to deploy to EBS to support your solution.
    • APEX_DATA_PARSER removes the need for SQL*Loader scripts to load data files into a staging table.
    • APEX Collections allow us to store data temporarily while it is being validated, thus, removing the need to create or maintain a set of staging tables.
  • More Parsing Options
    • APEX_DATA_PARSER allows you to natively parse XLSX, CSV, JSON and XML files, whereas SQL*Loader only handles delimited and fixed-format files.
  • High Volume Uploads
    • If you are uploading hundreds of thousands (or even millions) of records, then SQL*Loader is probably still the way to go. SQL*Loader offers features such as direct path load that make writing data to staging tables orders of magnitude faster than standard INSERT statements.
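To illustrate the parsing consideration above: a single query can turn an uploaded file into rows whether it is CSV, XLSX, XML or JSON, with the format detected from the file name. A sketch, assuming the upload landed in APEX_APPLICATION_TEMP_FILES and the file name is held in a page item:

```sql
-- Parse an uploaded file (CSV, XLSX, XML or JSON) into rows.
-- APEX_DATA_PARSER.PARSE detects the format from p_file_name and
-- returns generic columns col001..col300 plus line_number.
select p.line_number, p.col001, p.col002, p.col003
  from apex_application_temp_files f,
       table(apex_data_parser.parse(
               p_content   => f.blob_content,
               p_file_name => f.filename)) p
 where f.name = :P10_FILE;
```

The same query feeds APEX_COLLECTION.CREATE_COLLECTION_FROM_QUERY nicely, which is how the rows end up in a collection for validation.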
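Two of the considerations above, tagging a batch and cleaning up collections, are only a few lines in practice. A sketch, where MY_CONV_BATCH_S is a hypothetical sequence and CONVERSION_ROWS the collection used during validation:

```sql
declare
  l_batch_id number;
begin
  -- Generate a unique batch identifier for this interface run
  -- (MY_CONV_BATCH_S is a hypothetical sequence).
  select my_conv_batch_s.nextval into l_batch_id from dual;

  -- ... insert into the interface table, tagging rows with l_batch_id
  --     in the batch column (e.g. set_process_id for items) ...

  -- Delete the collection as soon as you are done with it, so large
  -- conversions do not degrade other code that uses collections.
  if apex_collection.collection_exists(p_collection_name => 'CONVERSION_ROWS') then
    apex_collection.delete_collection(p_collection_name => 'CONVERSION_ROWS');
  end if;
end;
```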

Example Conversions & Uploads

Here are just a few examples of EBS conversions and data loads we have developed for our customers using the above approach. Hopefully they will give you some ideas.
 
Inventory Item Uploads
Load new inventory items into EBS from an Excel Spreadsheet.
 
Inventory Item Category Assignments
Load item category assignments from an Excel Spreadsheet.
 
Inventory Item OnHand and Cost Upload
When you need to upload items you typically also need to load starting inventory on-hand balances as well as average or standard costs.
 
Mass Blanket Agreement Price Updates
If your organization has hundreds or even thousands of blanket purchase order agreements, applying price updates across these blankets can be a challenge. In this interface we allowed end users to build an Excel file of items with their new prices and apply them to blanket purchase orders using the Purchasing Open Interface.
 
Import WIP Labor Hours
If you use EBS for managing Work in Process (WIP) jobs, you need to capture labor costs to accurately cost these jobs. In this interface users uploaded csv extracts from their time and attendance system into an APEX page which then uploaded the data into EBS using the WIP Cost Transactions Interface.
 
Import External Payroll Data
In this instance the customer used Oracle Payroll but needed to interface additional payroll data from several 3rd party payroll systems. End users exported payroll reports from the external payroll systems and then dragged and dropped these files into an APEX page, from which the data was uploaded into Oracle Payroll. This gave the customer the ability to view payroll data across the entire organization.

Conclusion

You may still be using EBS, but by utilizing APEX alongside it, you can introduce significant process efficiencies and modernize the UI experience to the delight of your users.

Author

Jon Dixon, Co-Founder JMJ Cloud
