Channel: Infosys-Oracle Blog

Oracle ERP Cloud vs On-Prem: Part 1 - Introduction


1.  Introduction

This multi-part blog series compares the cost, agility, and talent considerations of implementing Oracle ERP (Cloud and On-Premise) for small, medium, and large businesses. 

Issue Statement

Infosys is an Indian multinational corporation that provides business consulting, information technology, and outsourcing services to clients across various industries around the globe.  The Oracle practice at Infosys produces value for its clients through consulting, implementations, roll-outs, and upgrades, as well as application support and maintenance (Infosys).  A strategic decision that the IT and financial executives of many Infosys clients face with increasing frequency is whether to implement an Oracle ERP application on-premise or on the cloud.  This decision is no longer purely technical or functional; it is primarily financial (Aberdeen).  The greatest challenge for executives making this decision is that there is insufficient data to support one type of Oracle Financial ERP implementation over another.  The purpose of this research is to evaluate the total cost of ownership (TCO) and return on investment (ROI) of a cloud-based Oracle enterprise resource planning (ERP) system versus an on-premise Oracle ERP platform, both for Infosys clients implementing Oracle Financial ERP for the first time and for those on older versions of the application.  It is intended to help Infosys sales teams and client IT and finance managers decide which offering provides the most value to the client. 

Analysis overview

When analyzing financial business processes, there are three drivers finance and IT executives must study: economics, agility, and talent.  Oracle touts the economic benefit of its cloud services: better cash flows through lower up-front infrastructure costs and licensing fees charged only for the desired features.  Because of the nature of cloud offerings, clients always have access to the latest software and hardware technology and functionality, high speed to market, and more standardization and simplification (Oracle).  Working on the latest technology with the latest features and functionality also helps firms retain and attract better talent, which can result in greater effectiveness, efficiency, and overall adoption. 

Recommendation Overview

For most clients, the challenges of cloud ERPs outweigh the benefits, resulting in only 20% penetration in Oracle Cloud ERP adoption.  Cloud implementations are not recommended for all clients; they are more suitable for those with limited investment in IT infrastructure and a user base that varies throughout the year (Joshi).  Clients that have already invested in infrastructure, and in the resources to protect and maintain those investments, may not be interested in jumping to the cloud.  However, as that infrastructure depreciates and becomes outdated, some clients will find cost savings in retiring the infrastructure assets and implementing a cloud iteration of their Oracle ERP financial application (McPherson). 

Planning and Implementation Overview

Regardless of which platform clients decide to implement, it is important to note that both platforms present unique challenges.  Through a thorough evaluation of the client's business processes, the value configurations of on-premise and cloud implementations will be used to determine how the discrete activities, drivers, and linkages impact the client's margins.  This data will be used to plan a course of action and help clients plan and implement an iteration of Oracle ERP financial applications (on-premise, cloud, or hybrid) based on the output of the internal analysis. 

Continue to Part 2: Oracle ERP Cloud vs On-Prem: Part 2 - Analysis


Oracle ERP Cloud vs On-Prem: Part 2 - Analysis


Part 2:  Analysis

Part 2 of the Oracle ERP Cloud vs. On-Premise series analyzes the various Cloud and On-Premise offerings to develop a better understanding of the available options.  The purpose of this analysis is to gain a deep understanding of the different business models and allow corporations to determine which implementation is best for them. 


Internal Analysis

Figure 2.1:  Value Chain for a typical manufacturing/operations firm

 

The value configurations of all corporations include finance as one of the activities (Figure 2.1).  When analyzing the impact of Oracle's Financial ERP applications specifically, the discrete activities can be broken down into business analysis, development, implementation, maintenance, and support (Figure 2.2).  Drivers are structural factors derived from the corporation's previous activities and investments; these include economics, agility, and talent.  Economics determines the cost impact of the various Oracle ERP offerings on a corporation's bottom line.  Agility measures the security, control, and customization aspects of an implementation.  Lastly, a highly skilled talent pool is required to carry out an implementation of any type (McPherson).  The total value system of the Oracle ERP Financial application includes the Procure to Pay (P2P), Acquire to Retire (A2R), and Order to Cash (O2C) financial sub-processes.  These processes are closely integrated within the system and provide accounting data to the Record to Report (R2R) process, which is then used to develop reports and provide financial information to reporting agencies, sponsors, and key stakeholders (Figure 2.2).  

Figure 2.2:  Value chain for Oracle ERP Financial applications

Offerings

Cloud Analysis

Cloud Software-as-a-Service (SaaS) platforms provide peace of mind to clients through predictable subscription costs for the ERP system, without requiring any investment in infrastructure and hardware.  The ERP cloud application is hosted on Oracle's infrastructure; it can be privately hosted for a single client, or the infrastructure's resources may be shared among multiple clients in a public cloud environment (Stoecklein).  Clients are not required to keep up with security standards for the infrastructure, as the vendor is responsible for data security.  Implementation costs are relatively low compared to an on-premise ERP system.  These factors may appeal to corporations looking for a quick implementation at low up-front cost, but the recurring costs can add up over time and exceed those of an on-premise implementation. 

On premise analysis

While a traditional on-premise ERP installation has obvious drawbacks in its infrastructure investments, it adds value to financial processes by allowing clients to perform highly complex processes through customizations, since the client owns the Oracle ERP software.  Moreover, organizations that are not adept at the latest data-security protocols would need to integrate data security and disaster recovery for their financial ERP systems into their existing collection of applications (Stoecklein).  Larger corporations that already have in-house data security and infrastructure for other applications may not find a cost benefit in implementing a cloud application.  Since vendors incorporate the costs of data security, disaster recovery, and other practices into cloud licensing fees, clients with on-site security teams can take advantage of lower licensing costs by implementing on-premise ERP applications.

Hybrid analysis

Lastly, for clients that require an ERP installation to conduct financial business processes but find that neither implementation fits their current strategy, Oracle offers a hybrid solution that combines the flexibility of on-premise ERP with the outsourced model of Oracle Cloud ERP SaaS.  It gives clients the ability to move the instance in-house at a later date, or vice versa.  This offering has become more common as SaaS gains popularity and clients grow wary of the hype around cloud implementations.  The model also suits companies that have not yet decided which implementation to pursue, or whose desired choice does not fit their value model. 

Cost

The choice between the different Oracle Financial ERP offerings boils down to one factor for IT and finance executives: margin.  An Oracle Financial ERP application plays an integral role in helping a company manage and integrate important financial business functions such as paying vendors, placing orders with customers and receiving payments, tracking assets, and generating accounting and financial statements.  A financial ERP system presents opportunities for corporations to improve efficiency, increase revenues, and control costs (Hedges).  In the past, Oracle ERP implementations were accessible only to large enterprises due to implementation, support, and licensing costs.  While an Oracle Cloud Financial ERP implementation alleviates the up-front cost factor, it can prove costlier than an on-premise implementation over time for clients with a large footprint if not thoroughly analyzed.  To make a fair comparison of application costs, they must be evaluated over the long term.  The general rule of thumb is for companies to expect total cost to be between 4 and 6 percent of annual revenues (Erik).  Factors that influence implementation costs include the size of the user base and divisions, the level of customization, and the required resources; the costs themselves cover ERP software, the database management system, infrastructure, employees, and consultants.  Below is a cost analysis breakdown for small firms with revenues under $250 million, medium firms with revenues from $250 million to $1 billion, and large enterprises with revenues greater than $1 billion. 
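The 4 to 6 percent rule of thumb above can be sketched as a quick calculation.  The sample revenues used here are illustrative assumptions chosen to fall inside the small, medium, and large tiers described in the text, not client data.

```python
# Illustrative sketch of the 4-6% rule of thumb for total ERP cost.
# The sample revenues below are assumptions, one per company-size tier.
def erp_cost_range(annual_revenue, low=0.04, high=0.06):
    """Return the (low, high) expected total ERP cost for a given annual revenue."""
    return annual_revenue * low, annual_revenue * high

for tier, revenue in [("Small", 200_000_000), ("Medium", 600_000_000), ("Large", 2_000_000_000)]:
    low, high = erp_cost_range(revenue)
    print(f"{tier}: ${low:,.0f} - ${high:,.0f}")
```

A firm can substitute its own revenue figure to get a first-order budget envelope before any detailed cost analysis.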

Size     % of Clients   Revenue                       Users   Modules
Small    55%            < $250 Million                100     10
Medium   30%            $250 Million - $1 Billion     500     11
Large    15%            > $1 Billion                  2500    13

Table 3.3:  Company Size

Small Businesses Costs

Smaller firms, with revenues of less than $250 million, represent approximately 55% of the clients (Table 3.3).  These corporations average 100 users in the Financial ERP application and do not have hefty investments in infrastructure or the full-time employees to manage that infrastructure and handle data security in-house.  Instead they rely on offshore contractors and third-party vendors to manage and maintain their applications, making them ideal candidates for Oracle Cloud ERP implementations (Kimberling).  Before the cloud, smaller companies would opt for Tier II vendors such as Microsoft Dynamics, Lawson, and NetSuite.  With the advent of cloud computing, Oracle can offer an advanced ERP system at similar costs by allowing multiple clients to share the same resources and infrastructure.  There are still, however, scenarios in which small businesses would require an on-premise implementation.  If a corporation is involved in a specialized financial environment that requires highly customized code or highly technical expertise, a cloud implementation would not fully support its business processes; even so, the customizations may not provide enough value to justify the high cost of an on-premise ERP implementation for a small business.  On-premise is required when there are strong legislative requirements on how data is stored and secured.  This generally applies to corporations focused mainly on government contracts, which require heavy industry data security and data stored behind a secured firewall to comply with regulatory guidelines.  Due to the lower number of users, and hence processing power, all aspects of the ERP system (application, database, and user interface) can be integrated into one physical box, also known as 1-tier architecture. 

Medium Businesses Costs

Medium businesses, ranging from over $250 million to less than $1 billion in revenues, represent roughly 30% of the clients.  Cost analysis is most critical, and most complex, for medium-sized businesses: the cost benefit of an Oracle Cloud Financial ERP is clear in a small-business environment, but it is not as straightforward in a medium-sized one.  If the analysis is not thorough, medium businesses risk driving losses, either through ineffective use of users' time on a non-customizable cloud platform or through spending more on an on-premise ERP that provides no distinct advantage over the cloud.  To evaluate the costs, medium-sized businesses must consider software licensing, implementation and customization, infrastructure, IT personnel, maintenance, and training for both cloud and on-premise Oracle Financial ERP implementations.  Moreover, because the cost outcomes are similar, mid-size businesses also have the option of a hybrid implementation (Lippincott and Wettemann).  A mid-size corporation may not require a highly customized implementation today, but if the company continues to expand into several different countries, hybrid would be a plausible solution.  Otherwise, if the costs of implementing cloud and installing an on-premise ERP are the same, cloud allows for more continuity and can be recommended over on-premise. 

Large Business Cost Analysis

Lastly, large businesses represent the remaining 15% of the clients; they earn greater than $1 billion in revenues each year and have user bases greater than 2,500.  These large corporations are multi-national enterprises that conduct business in over a dozen countries.  This adds complexity to financial processing, requiring highly customized processes that vary by country, department, legal regulations, and user roles.  Implementations in such environments can be continuous, and vendor-selected upgrade schedules would not allow these corporations to keep up with testing and training.  When strictly choosing between SaaS Oracle ERP and on-premise for large enterprises, the decision for on-premise is clear.  

Agility

Many small and medium-sized corporations are caught in a cycle of ERP replacement driven by global economic conditions and increasingly complex financial business processes.  This paves an ideal path for Oracle to market its Cloud Financial ERP application, with its short implementation timelines and low up-front costs.  However, adoption has been less than stellar.  Some clients are not convinced by the technical and functional limitations of Oracle's Cloud ERP application, while others worry about security and control. 

Security

A cloud environment can be either private or public.  In a private cloud environment, the application is hosted on the vendor's site in a segregated environment for an individual client, whereas in a public cloud environment, multiple clients share the same resources and infrastructure.  Although users from one client cannot access another client's data, the public model poses higher security risks, since hackers can obtain vital financial data and process information from multiple companies with a single breach (Erik).  This has made security the highest-ranking concern among clients.  The American National Standards Institute (ANSI) rates Oracle's data centers at Tier 4, the highest rating.  Oracle suggests that IT leaders match the internal security and service-level standards of top-quality data centers and consulting partners to avoid security leaks through users (Oracle). 

Control

The Oracle Financial ERP Cloud software is shared in a public cloud environment; hence, clients are all on the same software version and have less control over when upgrade patches are installed.  Upgrades require clients to validate system operability, perform end-to-end testing of the affected business processes, and train users on the differences in functionality.  Oracle releases three patches a year, one in each of the first three quarters, leaving out the fourth quarter due to annual financial close processes.  All testing and maintenance must be performed by a deadline specified by Oracle (Oracle).  Private offerings enable companies to install upgrades on their own schedules; however, mid-size companies tend to fall behind on upgrades because of the time, money, and resources required.  Moreover, by revoking direct access to the database, cloud-based services are generally less effective for solution development and problem solving: to obtain data from the database, instead of writing a script to pull the data themselves, developers and consultants must create service requests (SRs) with Oracle (Oracle).

Customizations

In most finance process scenarios, clients require customizations to extend the application's ability to adapt to their business needs.  On-premise Oracle implementations offer customizations through additional configurations and in-house extensions built on top of the core functionality, commonly known as RICEFW objects (Reports, Interfaces, Conversions, Enhancements, Forms, and Workflows).  In the on-premise environment, these customizations can be made efficiently through a combination of external resources (such as PL/SQL scripts, Oracle Application Framework, and workflows) and internal resources (developers and IT professionals).  Oracle Cloud ERP applications are still fairly new, and wide industry penetration is yet to be seen.  As it stands, Oracle Cloud ERP has significant restrictions on the types of customizations that can be made on the vendor's SaaS platform (Lippincott and Wettemann).

Talent

Demand for ERP implementation, support, and development consultants continues to grow while the supply of such resources remains short.  The technology and tools for on-premise applications have existed for decades with only subtle changes, producing a large pool of resources with the required skills and knowledge.  Cloud, by contrast, is a fairly new concept that offers advantages but also presents new talent challenges.  Because of the platform's recent introduction and popularity, cloud professionals are in high demand, yet few job seekers have accumulated enough experience to perform at the same level.  Employees working on newer cloud platforms tend to turn over less, but they also tend to get poached by other employers offering better salaries.  Alternatively, companies can outsource these demands to implementation partners such as Infosys (Lippincott and Wettemann).   


Continue to part 3:  Oracle ERP Cloud vs On-Prem: Part 3 - Recommendations

Oracle ERP Cloud vs On-Prem: Part 3 - Recommendations


3.  Recommendation

Part 3 in this series recommends which implementation small, medium, and large corporations should pursue based on user base, costs, and existing IT infrastructure investments.


Table 3.4:  Cloud vs On-Premise ERP Cost Comparison (line items shown per user)

                        Cloud                                 On-Premise
Company Size            Small      Medium     Large           Small      Medium     Large
User Base               100        500        2500            100        500        2500
Software Licensing      $3,000     $3,000     $3,000          $5,500     $2,500     $1,500
Customization           $500       $300       $100            $2,000     $1,000     $500
Implementation          $2,000     $1,500     $500            $3,000     $2,000     $500
Infrastructure          $1,500     $1,250     $1,000          $2,000     $1,000     $500
IT Personnel            $1,500     $750       $250            $2,000     $1,000     $750
Maintenance             $1,000     $750       $500            $2,000     $1,000     $750
Training                $3,000     $1,500     $750            $2,000     $1,000     $500
Cost Savings*           ($250)     ($150)     ($50)           ($750)     ($500)     ($250)
Total                   $1,225,000 $4,450,000 $15,125,000     $1,775,000 $4,500,000 $11,875,000

*Savings through automation and customization
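The totals in the cost comparison follow directly from the line items: each category is a per-user cost, and the total is the per-user sum, net of savings, multiplied by the user base.  A minimal sketch using the Cloud/Small column as an example:

```python
# Reproduce the Cloud/Small total from the cost comparison table: per-user
# costs summed across categories, net of per-user savings, times the
# 100-user base for a small firm.
per_user_costs = {
    "software_licensing": 3000,
    "customization": 500,
    "implementation": 2000,
    "infrastructure": 1500,
    "it_personnel": 1500,
    "maintenance": 1000,
    "training": 3000,
    "cost_savings": -250,  # savings through automation and customization
}
users = 100
total = sum(per_user_costs.values()) * users
print(f"${total:,}")  # → $1,225,000
```

Swapping in the per-user figures and user counts of the other columns reproduces each of the remaining totals the same way.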

 

As the implementation partner for various clients, the goal of Infosys consultants is to recommend and design an implementation path for the product that produces the most value for the client.  In the value chain model of any organization, finance plays a critical strategic role in helping CEOs set objectives, make decisions, and plan for the future (Figure 2.1).  Oracle ERP Financial installations allow corporations to improve efficiency, increase revenues, and control costs.  Cloud offers lower implementation costs for small businesses, but the licensing costs can add up.  The break-even analysis shows costs to be equal somewhere between 500 and 2,500 users, depending on current infrastructure investments, customizations, and existing resources. 


Table 3.1:  Cloud ERP Internal Evaluation



Table 3.2:  On-premise ERP Internal Evaluation


Small Businesses

Prior to cloud platforms, ERP applications were out of reach for smaller businesses due to up-front infrastructure investments and high implementation costs.  Cloud is an emerging field that enables multiple smaller businesses to share ERP application infrastructure hosted by the vendor, resulting in lower costs and lower turnover.  As Table 3.1 demonstrates, a cloud implementation provides a greater margin than on-premise for a small user base.  As the user count grows, the cost of a cloud implementation grows linearly.  Smaller companies can take advantage of cloud implementations by avoiding heavy investments in infrastructure, IT personnel, and maintenance.  On the downside, cloud offerings are shared among multiple clients, and clients lose control of customizations, maintenance schedules, and security.  This results in quarterly user training and testing, since the application is upgraded more frequently, but the margin is still greater than that of an on-premise implementation.  

Medium Businesses

For medium businesses, the choice of implementation depends on the firm's strategy.  Graph 3.5 shows that the costs for 500 users in a medium-sized corporation are almost identical for cloud and on-premise ERP implementations.  Corporations that do not rely heavily on customized business processes will find value in cloud applications.  Depending on the growth of the business, a hybrid ERP implementation might fit better into the client's ERP strategy, since moving from cloud to on-premise, or vice versa, will not require the client to re-implement the financial suite of Oracle ERP applications.  A cloud ERP would be beneficial if the client does not foresee growth and the firm's financial processes are not heavily customized.  On-premise would be required for firms that deal primarily with government contracts or confidential financial information, which requires heavy industry data security and data stored behind a secured firewall to comply with regulatory guidelines. 

Large Businesses

When it comes to large enterprises with user bases in the multiple thousands, the decision is clear: due to economies of scale and the costs of licensing and implementing Oracle ERP, the on-premise option will always result in a more economical and effective implementation (Graph 3.5).  Global enterprises generally have financial operations in a dozen or more countries.  Since these implementations can take a number of years to complete, the enterprise needs strict control over security, maintenance, and customization because of its highly complex financial processes.  With that control, the ERP application can be automated and shaped to fit the business needs, providing greater efficiency and margin, security, and control over the application.  A cloud implementation for large enterprises will yield diminishing returns due to the lack of customization and value-add to the financial processes.  Development in these complex environments requires top talent; the challenge for large businesses will be retaining that talent, as employees seek the greater challenge of working with newer technologies while the corporation remains on decades-old technology.  

Graph 3.5:  Cost of Implementing Oracle ERP


Continue to Part 4:  Oracle ERP Cloud vs On-Prem: Part 4 - Implementation and Conclusion

Oracle ERP Cloud vs On-Prem: Part 4 - Implementation and Conclusion


4.  Implementation and Planning

The final part in this series briefly describes the implementation route once the organization decides on an ERP platform. 


An ERP implementation project consists of discrete activities that break down the implementation process: business analysis, development, implementation, and post-production support (Figure 2.2).  These primary activities are consistent across cloud, on-premise, and hybrid implementations, but the timelines and budgets will vary.  A study conducted by Panorama Consulting concluded that over 50% of ERP implementation projects run over schedule and over budget (Panorama).  Due to the recent introduction of Cloud ERP applications and low client penetration, there is insufficient data specific to cloud implementations to determine how many come in under budget and on time.  Hence, it is important for implementation partners like Infosys to develop a strategy that allows clients to segregate the various financial activities and build a strategy that incubates the implementation through a total value system (Figure 2.3).  Each of the financial processes (P2P, A2R, O2C, and R2R) consists of primary activities, drivers, and linkages; the output of each process is generated by evaluating the impact of drivers (cost, agility, and talent) and linkages (people, processes, and technology) to determine the total cost of ownership, and thus the return on investment, of cloud and on-premise implementations (Figure 2.3). 

Figure 2.3:  Oracle Financials ERP Total Value System

Conclusion

The decision to implement an ERP on the cloud or on-premise boils down to one thing: margin.  It is vital for a corporation to find an ERP package that matches the company's needs, one that supports its business processes and offers value propositions to the business model.  For companies with fewer than 500 users or more than 2,500 users, the decision can be made without a thorough evaluation process (Graph 3.5).  For medium businesses, however, the choice of implementation is not as straightforward.  Infosys' specialization in Oracle EBS, along with a strategic partnership with Oracle, allows clients to gain a strategic advantage in evaluating the costs, agility, and resources needed to decide which model of Oracle ERP to implement. 

Robotic Process Automation - Capabilities Overview


 

Robotic Process Automation - Capabilities Overview

Understanding the Basics

 


 

 

Introduction

Every few years, the IT/ITeS industry sees a new product or technology that brings exciting new UI features, capabilities for users to configure applications easily, and a host of new buzzwords and concepts for technology enthusiasts to get accustomed to.

Robotic Process Automation and Artificial Intelligence are two such buzzwords; they have excited fast-growing organizations and the IT industry in equal measure and are catching on fast.

For organizations, they open up new avenues to achieve higher operational efficiency and cost reduction, ultimately improving the bottom line; for the IT industry, they open new horizons to grow the client base with new offerings and expand the technological footprint.

In this blog, we'll focus on an overview of Robotic Process Automation: a basic understanding of RPA, the types of RPA, and what encourages fast-growing organizations to adopt it.

 

What is RPA?

In today's competitive consumer market, organizations face the perpetual challenge of moving swiftly and keeping operating costs low while increasing consumer satisfaction and service quality.

Organizations know that to reduce costs they must achieve higher operational efficiency; however, hiring more staff to achieve it directly hurts the bottom line, and increasing working hours or paying overtime to existing staff likewise dents profits. To alleviate these challenges, organizations are finding their savior in 'Robots'.

In simple words, RPA tools (Robots) emulate the manual steps users perform in and across applications through the UI. RPA tools operate by mapping a rule-based workflow process which the "robot" can follow to the 'T'.

An important point to note is that these Robots can be implemented agnostic of system or application. They can be as simple as a batch file upload automation or as advanced as cognitive automation with self-learning and variable-format processing capabilities. Processes can be triggered manually or automatically to:

·         Populate data across different systems and modules within them

·         Run queries on a scheduled basis and perform data reconciliation

·         Generate and distribute reports

·         Audit large volumes of data

·         Trigger downstream activities and processes
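The rule-based workflow idea behind these triggers can be sketched in a few lines. The invoice records, threshold, and actions below are hypothetical, chosen only to illustrate how a robot follows its mapped rules 'to the T'.

```python
# Minimal sketch of a rule-based RPA workflow: each rule pairs a condition
# with an action, and the "robot" applies the first matching rule per record.
# All names, thresholds, and actions here are illustrative assumptions.
def reconcile(record):
    return f"reconciled {record['id']}"

def escalate(record):
    return f"escalated {record['id']}"

RULES = [
    (lambda r: r["amount"] > 10_000, escalate),  # high-value items go to a human queue
    (lambda r: True, reconcile),                 # everything else is auto-reconciled
]

def run_robot(records):
    results = []
    for record in records:
        for condition, action in RULES:
            if condition(record):
                results.append(action(record))
                break
    return results

print(run_robot([{"id": "INV-1", "amount": 500}, {"id": "INV-2", "amount": 50_000}]))
# → ['reconciled INV-1', 'escalated INV-2']
```

Real RPA platforms express these rules through visual workflow designers rather than code, but the execution model is the same: deterministic conditions mapped to repeatable actions.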

Studies conducted by various institutions and agencies indicate that large organizations can typically save 20-40% of employee workload with the help of automation; imagine what organizations could achieve with employees having 20-40% more of their time to focus on value-added tasks.

 

Types of Automation

As mentioned above, there is a plethora of tasks Robots can do, ranging from simple, mundane activities like data entry to highly complex activities like generating a dynamic response to a user query based on machine learning and cognitive abilities.

There are different stages, or levels, of Intelligent Automation:

·         Digital Worker - As the name suggests, this is the primary, or entry, level of automation an organization can adopt while still achieving efficiency gains.  This is the typical Robotic Process Automation tool, which can perform tasks like:

o   Data entry

o   Running Excel functions for data validation

o   Triggering customized emails with preset content or standard templates

o   Data comparison

o   Setting up reminders

o   Batch processing and populating mapped fields

o   Queuing and Assignment

 

·         Digital Reader - This is the secondary level of automation, alternatively referred to as 'Cognitive Automation'. Robots at this stage can perform tasks involving:

o   Machine Learning

o   Pattern or Keyword based recognition which is evolving over time as Robot sees and identifies more patterns / keywords

o   Data processing across variable formats

o   Dynamic queue assignment based on patterns

o   Complex analysis based on continual learning 

 

·         Digital Talker - This is an automation offering focused on providing a more interactive experience. Amazon's Alexa and Google Home are popular examples. These robots are also called 'ChatBots'; they perform tasks similar to those of the previous 'Digital Reader' category but have additional text and voice capabilities and are more communication-focused. Robots at this stage can perform additional tasks involving:

o   Predictive Analysis

o   Customer Servicing

o   Query Resolution based on pattern or keyword based recognition which is evolving over time as Robot sees and identifies more patterns / keywords

 

·         Digital Thinker - This is the advanced level of automation: classic Artificial Intelligence. Artificial Intelligence tools are somewhat comparable to humans in terms of intelligence and have their own IQ. Currently, the IQ of these AI tools is significantly lower than that of humans. Per studies performed in 2016, the IQ of Google's A.I. (47.28) is nearly two times that of Siri (23.94); however, a six-year-old child beats both of them in smartness and thinking capability. An average person's IQ is in the range of 85-114.

As the IQ of these applications or tools increases, up to a certain point it will be beneficial for people; once the IQ surpasses that of the average human, then we all know what will happen - we have all seen the sci-fi movies. :)

Nonetheless, the Digital Thinker can perform the activities below in addition to those listed for the previous categories:

o   Predictive Analysis based on cognitive learning and complex algorithm 

o   Complex Mathematical Analysis

 

 

RPA Benefits

 

RPA_Diagram_1.jpg

Conclusion

Organizations need to be smart enough to understand their IT landscape and business process steps, and to identify the right tasks that can be automated with the help of robots. Although competitors might be at a higher level of automation, an organization needs to be realistic in its approach and progress through the automation stages with a proper strategy and careful planning to reap the benefits of automation.

 

Reference

1.       https://www.cnbc.com/2017/10/02/google-ai-has-almost-twice-the-iq-of-siri-says-study.html

2.       https://en.wikipedia.org/wiki/Robotic_process_automation

OAC-Essbase Data Load & Dimension Build Using CLI


 

Introduction

I am working with one of the Oracle EPM Cloud implementation projects focused on migrating on-premise Essbase applications to Oracle Analytics Cloud (OAC) Essbase cloud applications. OAC Essbase provides Command Line Interface utilities that can be used for data load and dimension build in OAC Essbase applications. This document explains how to use the utility for data load and dimension build in OAC.


Utilities

Command Line Utility - We can download the Command Line Tool from the OAC Essbase instance to our local machine to perform Essbase data load and dimension build tasks.

 Utilities.png

Setting up CLI environment

  • Open the command prompt and change the directory to the CLI home directory.
  • To use the Command Line Interface, Java JDK 8 must be installed and the JAVA_HOME path must be set.
  • Set the CLI Home and Java Home:

                                SET CLI_HOME=D:\CommandLineUtility

                                SET JAVA_HOME=C:\Program Files\Java\jdk1.8.0_161

 

Logging into OAC Essbase through CLI

Before performing Dimension build and data load activities, we need to be logged into OAC Essbase.

Logging into OAC using admin id:

D:\CommandLineUtility> esscs login -user TestAdmin -password ****** -url https://test.OAC.com/essbase

                user " TestAdmin " logged in with "service_administrator" role

                Oracle Analytics Cloud - Essbase version = 12.2.1.1.112, build = 211

 

Create a Local Database Connection

The local DB connection can be created using the CLI command createLocalConnection. It takes all the required JDBC connection details as arguments.

Command Syntax:

D:\CommandLineUtility>esscs createLocalConnection -name oraConn -connectionString jdbc:oracle:thin:@DevDW:1XXX/DevID -user DB_USER

                Connection already exists, it will be overwritten

                Enter Password:

                User credentials stored successfully



Essbase Dimension Build

  • Run the dimbuild command with the stream option.
  • A database query is required, either in the rules file or provided as an argument to dimbuild; if not given on the command line, it is taken from the rules file.
  • The streaming API is used to push the result from the database to the cube.

Command Syntax:

D:\CommandLineUtility>esscs dimbuild -application TEST -db TEST -rule Acct.rul -stream -restructureOption ALL_DATA -connection oraConn

                Streaming to Essbase...

                Streamed 9 rows to cube

 

Essbase Data Load

  • Run the data load command with the stream option.
  • A database query is required, either in the rules file or provided as an argument to the data load; if not given on the command line, it is taken from the rules file.

Command Syntax:

D:\CommandLineUtility>esscs dataload [-v] -application TEST -db TEST -rule DataLoad.rul -stream -connection oraConn

                Streaming to Essbase...

                Streamed 10 rows to cube
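For repeated loads, the dimbuild and dataload commands above can be scripted. A small, hedged Python sketch that assembles the same esscs argument lists (using the example application, rule, and connection names from this post) is shown below; actual execution would use subprocess once esscs is on the PATH and you are logged in.

```python
# Sketch: assemble the esscs commands shown above so repeated dimension
# builds and data loads can be scripted. Application/db/rule/connection
# names are the examples from this post, not real defaults.

def esscs_cmd(action, app, db, rule, connection, extra=()):
    """Build an esscs command line as a list of arguments."""
    cmd = ["esscs", action,
           "-application", app, "-db", db,
           "-rule", rule, "-stream",
           "-connection", connection]
    cmd.extend(extra)
    return cmd

dimbuild = esscs_cmd("dimbuild", "TEST", "TEST", "Acct.rul", "oraConn",
                     extra=["-restructureOption", "ALL_DATA"])
dataload = esscs_cmd("dataload", "TEST", "TEST", "DataLoad.rul", "oraConn")

print(" ".join(dimbuild))
# e.g. subprocess.run(dimbuild, check=True) after `esscs login`
```

Building commands as argument lists (rather than one string) avoids shell-quoting issues when the script is later run via subprocess.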


Reference(s):

1.https://docs.oracle.com/en/cloud/paas/analytics-cloud/essug/command-line-interface-cli.html.

2.https://support.oracle.com/epmos/faces/DocumentDisplay?_afrLoop=442454943227210&parent=EXTERNAL_SEARCH&sourceId=HOWTO&id=2259032.1&_afrWindowMode=0&_adf.ctrl-state=aphtjq7kl_132.

 

Oracle Data Visualization (DVD/DVCS) Implementation for Advanced Analytics and Machine Learning


Oracle Data Visualization Desktop (DVD) or Cloud Server (DVCS) is a very intuitive tool that helps every business user in the organization create quick and effective analytics easily. People at all levels can blend and analyse data in just a few clicks, helping the organization take informed decisions using actionable insights. Oracle DVD is a Tableau-like interactive tool that helps create analysis on the fly using any type of data from any platform, be it on-premise or cloud. The main benefits of Oracle DVD are:

·         A personal single-user desktop tool, or a SaaS cloud service, which can be leveraged by any business user in the organization

·         Enables the desktop user to work even offline

·         Completely private analysis of heterogeneous data

·         Business users have full control over their datasets/connections

·         Direct access to on-premise or cloud data sources

·         Administration tasks are removed completely

·         No remote server infrastructure required

Oracle DVD/DVCS enables business users to perform analysis using traditional methodologies, and also provides the capability to perform Advanced Analytics using R and to create predictive models using machine learning algorithms in Python.

This simple and intuitive tool lets you perform Advanced Analytics by just installing the required packages. DVML (Data Visualization Machine Learning library) is the tool that helps you install, in one go, all the packages required for implementing machine learning algorithms for predictive analysis.

The Install Advanced Analytics (R) utility helps you install all the required R packages to perform Advanced Analytics functions like regression, clustering, trend lines, etc. However, to run both utilities on your personal system/server, you need administrative access as well as access to the internet and permission to automatically download all the required packages.


Below, we discuss how to leverage Advanced Analytics and machine learning functions to provide predictive analytics for the organization.

In order to create a Trend line graph, we need to enable Advanced Analytics and then pull required column into the Analysis.

Trend line function: This function visualizes the data in a trending format. It takes a numeric expression, an optional series/partition specification, a model type, and a result type.

Syntax: TRENDLINE(numeric_expr, ([series]) BY ([partitionBy]), model_type, result_type)

Example : TRENDLINE(revenue, (calendar_year, calendar_quarter, calendar_month) BY (product), 'LINEAR', 'VALUE')

We need to create various canvases and put them into one storyline, providing a corresponding description on each canvas. While creating a trend line visualization, we need to provide the confidence level; by default it is 95%.


Mitigating Low User Adoption in Sales Automation

So the project went live perfectly. It was on-time, on-budget, and all the success criteria were met. Except one. This is a nightmare scenario for many project managers and sponsors: the sales automation project that they worked so hard on for many months and executed perfectly (in their opinion) does not seem to enthuse the end users, resulting in the very common problem of low user adoption. In this blog we are specifically talking about low user adoption in sales automation projects, although many aspects may be common with other endeavors as well.

Below are the major causes of low user adoption and their mitigations:

Lack of Process Adherence or "We don't work that way"

Often, in the hurry to implement 'best practices' and a 'vanilla solution', short shrift is given to some core processes in the sales organization. Sometimes, in a global implementation, processes are 'standardized' without real buy-in from regional stakeholders, who may perceive that their way of doing business has not been heard sufficiently. 
Mitigation: Get explicit sign-offs and buy-in from stakeholders when processes get modified. Build in customizations where required to ensure core processes are protected.

Lack of Trust or "Is my data secure?"
Another reason your sales reps are reluctant to share information on the sales automation application is lack of trust. For sales reps, their contact and account information is gold. They do not want just anybody in the organization having access to their contact and account details. Sales teams may not have a problem with their managers accessing data, but may not want, say, the marketing team to get access to their contact details without their knowledge. If their misgivings in this regard are not addressed, you will find that they may not be updating their most important information. 
Mitigation: Most software today comes with configurable security parameters. You should ask your SI to implement suitable security configurations that balance the need for sales effectiveness with the trust concerns of your sales teams. 

Lack of Business Commitment or "Even my Manager doesn't use it"
Many times, sales automation projects focus only on direct sales reps as the end users. This is a mistake: although direct sales reps may form the largest part of the sales force, when other sales roles like sales managers, partner managers, and key account managers are not included, the direct sales team perceives that they have been saddled with an unnecessary burden. This results in them not taking the implementation seriously, and thus in low user adoption.
Mitigation: It is important that companies take a strategic view of sales automation and implement functionalities that benefit the entire sales organization. Hence we recommend implementing modules like Sales Forecasting management, which requires sales managers to review forecasts from their reps and in turn submit them to their managers. Modules like Partner Relationship Management are used by partner managers to manage sales processes through the partner organization. Customer Data Management and Incentive Compensation functionalities involve the sales operations teams to ensure data quality and sales incentives through the sales automation product.

Lack of Knowledge or "Not sure how it works"
Most SIs and implementation consultants work on the "Train the Trainer" model, where key users from the sales organization are trained on various aspects of the application. It is then expected that these key users will in turn go back and ensure quality end-user training. Many companies ignore or do not pay enough attention to this phase of the project since vendors are not involved in it. It is not surprising, then, that inadequately trained end users forget what they learned and go back to their old way of doing things.
Mitigation: It is important that enough thought is put into the end-user trainings as well. If the number of end users is large, it should be treated as a separate project, and vendors can be involved in this phase as well. Appropriate training collateral should be developed and the rollout planned so that individual attention can be given to each participant in the training sessions. Follow-up or refresher training can also be organized on a need basis.

Lack of Productivity or "The Old way was better"
Although sales effectiveness, improved sales reporting, and sales collaboration are all important reasons to implement a sales automation application, user adoption will suffer if sales reps feel that these benefits come at the cost of their productivity. Companies should guard against building a 'comprehensive solution', as that may mean sales reps have to spend more time on the application when they would rather be selling and having face time with their prospects and customers.
Mitigation: Sales productivity should be an important success metric and included as part of all requirements and design conversations. Data entry requirements should be kept to the minimum mandatory fields, with the rest optional. Application performance should be tested comprehensively so that improvements can be made before go-live. Mobility and Outlook sync functionalities should be explored to improve productivity. 

Lack of Perceived Value or "What's in it for me?"
This is perhaps the most important question that needs to be answered in terms of user adoption. Unless the sales automation application helps sales reps meet their personal career goals, they are not going to spend time on it. It is important that they perceive the application as a tool that will improve their sales effectiveness, help them get recognition, and advance their careers.
Mitigation: Sales automation software should focus on sales effectiveness improvements: sales collaboration, new technology interventions like AI/ML to help the salesrep focus on the important leads and improve win rates, and intelligent analytics that provide not just information but also insights on key concerns and suggest a workable plan of action. Sales Performance and Gamification solutions can work on top of the base solution to provide value in real terms to the sales users.

Keeping Track
It is important to measure user adoption through analytical reports to understand its status even after applying many of the above mitigation measures. Reports should give an adoption breakdown by region, role, etc. to answer questions like: which sales roles are using or not using the application? Which country's users are lagging behind? Answers to such questions will help the IT organization take suitable interventions and corrective measures. All the best on your user adoption journey!


Automating the Purge Job for Usage Tracking Data

 

Why Usage Tracking?

Usage Tracking helps measure and monitor user interactions with OBIEE. It provides deep insight into usage statistics and performance bottlenecks in the reporting application.

The usage tracking functionality creates an entry in the S_NQ_ACCT table whenever a report is executed by a user.

This table captures metrics like report execution time, report start/end time, user ID, etc.

With usage tracking enabled, it is possible to determine which user queries are creating performance bottlenecks, based on query response time.

It also provides information on frequently accessed reports. Enabling it involves Enterprise Manager setup changes and RPD changes.

Why Automate the Data Purge?

For a reporting application that receives user requests every minute, Usage Tracking generates a huge volume of data. This data gets written to the S_NQ_ACCT database table. Purging this table periodically is essential; otherwise, reports created on top of usage tracking data will perform slowly. Manually purging this data requires intervention from the database team and adds overhead to application maintenance.

We can automate the data purging of the S_NQ_ACCT table using BI Publisher technology. This automation will work for any data purging, and the entire automation can be done with the technology stack that already exists with the BI application; there is no need for additional tools.

Steps to Automate:

  1. Create a BI Publisher data source with READ WRITE user for OBIEE meta-data schema.

  2. Create a database package which deletes records from S_NQ_ACCT table.

  3. Create a BI Publisher data-model to invoke the DB package via Event Trigger.

  4. Create a scheduled job which will invoke the data-model periodically.

 

  1. Create a BI Publisher Data Source

     

Go to BI Publisher Administration and click on JDBC Connection as shown below.

Click on "Add Data Source"

 

 

 

 

Enter the following details for a New Connection

     Data Source Name: Give a Data source name.

    Username: Enter a user name that has read and write access to the OBIEE metadata schema.

     Password: Enter the password associated with the user name.

Click on Test Connection, a confirmation message will be displayed.

  3. Create a Data Model to report how many records were purged.

    Go to New → Data Model

     

 

 

 

Enter the following details

Default Data Source: Select the Data source which is created in above step from dropdown.

Oracle DB Default Package: Enter the package name created in the database in the OBIEE metadata schema.

 

The package code is attached for reference.

 

 

 

Click on Data Sets and select SQL Query as shown below.

 

Enter the following Details

Name: Enter a Data set name

Data Source: Select the newly created Data source from the dropdown

Write the Query and click on OK.

Query: SELECT COUNT(1) num_deleted_records FROM APLINST02BI_BIPLATFORM.S_NQ_ACCT WHERE start_dt < SYSDATE - (:m)

 

Create an Event Trigger to invoke the data model's package after the report is triggered by the scheduler.

Enter the Event trigger details as below

Name: Enter the name of the Event Trigger

Type: After Data

Language: PL/SQL

The Oracle Default Package will be populated automatically. Select the appropriate function that will trigger the report from the Available Functions section and move it to the Event Trigger section by clicking the ">" icon.

 

Now click on Parameters and provide the parameter details to pass to the event trigger. This parameter passes the number of days (m) used to purge the data from the S_NQ_ACCT table with the logic below.

DELETE FROM Schema_Name.s_nq_acct WHERE start_dt < sysdate - m;
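The package's retention rule is just the single DELETE above. A small Python sketch simulating the same m-day rule on in-memory rows (dates and row shape are illustrative assumptions):

```python
# Sketch of the purge rule: keep only rows whose START_DT is within the
# last m days, mirroring DELETE ... WHERE start_dt < sysdate - m.
from datetime import date, timedelta

def purge(rows, m, today):
    """Return (kept_rows, number_deleted) for the m-day retention rule."""
    cutoff = today - timedelta(days=m)
    kept = [r for r in rows if r["start_dt"] >= cutoff]
    return kept, len(rows) - len(kept)

today = date(2024, 1, 31)
rows = [{"start_dt": date(2024, 1, 30)},   # recent, kept
        {"start_dt": date(2023, 11, 1)}]   # older than 30 days, purged
kept, deleted = purge(rows, 30, today)
print(deleted)  # 1
```

The SQL query in the data model (SELECT COUNT(1) ... WHERE start_dt < SYSDATE - :m) reports the same count that this sketch returns as `deleted`.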

 

 

 

  4. Create an RTF template for scheduling a job to automate the purge

Go to New → Report

 

 

 

 

 

 

 

 

 

 

 

 

 

 

Click on Upload

Rtf Document for reference.

 

 

 

 

Save the report in the required folder after providing the above details.

Now click on More → Schedule

Enter a value for the parameter 'm' (the number of days)

 

 

 

 

Enter the details for scheduler as below

Name: Give the output file a name

Layout: This will be populated automatically

Format: Select the required format of the report from the dropdown

In the Destination section, select the Destination as Email or FTP and provide the details accordingly.

In the Schedule tab, set how frequently the job should run.

 

Now click on Submit Job in the top right corner. A job will be scheduled as per the given details.

OAC Essbase Application Migrations from On-Premise


Introduction

I am working on an Oracle EPM Cloud implementation project focused on migrating on-premise Essbase applications to Oracle Analytics Cloud (OAC) Essbase cloud applications. OAC Essbase provides a few utilities that can be used for exporting applications from the on-premise environment and importing them into OAC. This document explains how to migrate Essbase applications to OAC using these utilities.

 

Utilities

Below are the utilities, downloaded from the OAC environment. I downloaded the Export Utility and Command Line Tool to my local machine for migrating on-premise applications to OAC Essbase.

                1. EssbaseLCMUtility

                2. CommandLineUtility

 

Utilities.png

 

Prerequisites

       To use the Essbase LCM utility and Command line interface, Java JDK 8 should be installed and the JAVA_HOME path should be set.

       On-premise Essbase applications should be converted to Unicode mode (UTF-8 encoding) before migrating into the OAC Essbase environment. I used the below MaxL script to convert the application to Unicode mode.

I changed the server-level variables to application-level variables.

Exporting On-Premise Application

       I followed the below steps to export the on-premise application using the Essbase LCM Utility.

       Open the 'CMD' and change the directory to 'D:\EssbaseLCMUtility' where I have downloaded the utility.

       Run the below command to export the application from Essbase. This command exports the data and artifacts.

                EssbaseLCM.bat export -server test.xxx.local:1423 -user TestAdmin
                -password ******** -application TEST -zipfile TEST.zip -skipProvision

 

Export Application - Progress

 

Export.png

Application Export Folder

-          The application folder is exported to the EssbaseLCMUtility folder.

ExportLocation.png

 

Importing the application into OAC

I manually copied the exported application folder to the CommandLineUtility home folder.

 

CommandLineUtility.png

 
 


·         I executed the below CLI commands to import the application into OAC Essbase.

·         Set the Java Home and CLI Home:

                                SET CLI_HOME=D:\CommandLineUtility

                                SET JAVA_HOME=C:\Program Files\Java\jdk1.8.0_161

·         Logging into OAC:

                                esscs login -user TestAdmin -url https://test.xxx.com/essbase

·         Importing Application into OAC:

                                esscs lcmimport -v -zipfilename TEST.zip

 

Import Application - Progress

 

Import.png

 

Application in OAC Environment

The application migration is successful. Go to the OAC application console and refresh the application list to see the migrated application.

 

Console.png
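The export, login, and import steps above can also be scripted end to end. A hedged Python sketch that builds the same command sequence (server, user, and application names are the examples from this post; actual execution needs the utilities installed, credentials, and subprocess calls):

```python
# Sketch: the LCM export + CLI login/import sequence as argument lists,
# using the example names from this post. Execute each with
# subprocess.run(cmd, check=True) once the utilities are installed.

def migration_commands(app, zipfile, server, user, url):
    """Return the on-prem export and OAC import commands, in order."""
    export_cmd = ["EssbaseLCM.bat", "export",
                  "-server", server, "-user", user,
                  "-application", app, "-zipfile", zipfile,
                  "-skipProvision"]
    login_cmd = ["esscs", "login", "-user", user, "-url", url]
    import_cmd = ["esscs", "lcmimport", "-v", "-zipfilename", zipfile]
    return [export_cmd, login_cmd, import_cmd]

cmds = migration_commands("TEST", "TEST.zip", "test.xxx.local:1423",
                          "TestAdmin", "https://test.xxx.com/essbase")
for c in cmds:
    print(" ".join(c))
```

Scripting the sequence this way makes it repeatable when many on-premise applications need to be migrated.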

 

Reference(s):

1. https://docs.oracle.com/en/cloud/paas/analytics-cloud/essug/preparing-migrate-premises-applications-cloud-service.html.

2. http://www.redstk.com/migrating-an-essbase-cube-to-the-oracle-analytics-cloud/.

 

Gamify your Sales Automation (Baseball theme example included!)

Looking for a way to bring some excitement, motivation and a sense of competition within your sales force but do not necessarily want to spend extra dollars on incentive payouts? Do you want to improve user adoption of your sales automation application but not spend time and effort on retraining or having to listen to complaints from the sales team about how the system is no good? Gamification may be the answer that you are looking for.

Gamification in sales automation refers to creating game-like scenarios which include principles like garnering points, rankings, competition, etc. to motivate your sales teams in a non-monetary way, although in some cases points earned may also be redeemed for non-cash incentives if the organization so chooses. The objectives for gamification may be manifold:

Process adherence: Organizations may have trouble getting their sales teams to follow recommended sales processes. Examples include updating contact information or capturing minutes of meetings with clients. Such activities may even seem trivial to sales managers who do not wish to spend time discussing these items with their teams and who might rather spend their time on more 'important' matters like specific opportunity details, sales forecasts, pipeline reviews, etc. Gamification can address such situations effectively by reinforcing ideal behavior through rewarding points to salesreps who follow the recommended sales process.

User adoption: Organizations implement sales automation software only to find that their sales teams couldn't be bothered to use it. Gamification can give sales reps and managers a reason to start using the application and lead them to understand the benefits of sales automation.

Sales engagement: Sales resources tend to work in isolation. They are on the road constantly meeting clients and prospects, and there isn't enough time to build employee engagement. Internal office meetings tend to be formal reviews and planning exercises, which can be quite serious affairs. Gamification can help reduce tensions within sales teams, bring some fun into the office culture, and bring about some good-natured competition and a feeling of 'know thy team'.

Gamify Sales Activities

Below are some models or examples of how simple gamification can be designed for routine sales activities using a points system. Salesreps can be notified of their accumulated points and also be ranked vis-a-vis other salesreps.

Gamify activities performed on Lead and Opportunity objects by assigning them suitable points. For example

  • Creating a Lead gets you 1 point.
  • If the Lead is Qualified, you get 2 Points.
  • If the Lead is converted to an Opportunity, you get 3 points, and so on.
Similar gamification can be performed on Account and Contact objects. For example,

  • Creating an Account gets you 1 point
  • Adding a contact to the account gets you 2 points
  • If the contact is a decision maker (Title VP or higher), you get 3 points and so on.
Gamify Sales Performance Metrics

Sales performance metrics can be mapped to sports themes. Below is an example of mapping them to a baseball theme. Such gamification can then be included as part of the salesrep's profile which is viewed by everybody in the organization. Similar themes around other sports or games can be creatively designed.

  • Batting Average- % of Leads that get converted to Opportunity. A batting average of 0.250 means 1 out of 4 Leads are getting converted
  • On Base Percentage- % of Leads that get converted to Opportunity but also includes walks (standalone opportunities)
  • Slugging Percentage- % of Won Opportunities upon total opportunities (A slugging percentage of 0.500 means 50% of Opportunities are won)
  • Home Runs- Number of high value opportunities won (say above 10k)
  • Hits- Number of Opportunities that reached a particular sales stage (say Submit Quote)
  • Runs Scored- Number of Opportunities Won
  • Assists- Number of Opportunities won where the salesrep is not the owner but on the sales team
  • Errors- Number of Stale Opportunities
Note that control functionalities may have to be built into the game mechanics to ensure that sales users don't enter dummy or wrong data to win points or to score more.
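The points system and baseball-style metrics above are simple ratios and sums, so they are easy to compute from CRM data. A minimal Python sketch (all counts are illustrative inputs, not real CRM fields):

```python
# Sketch of the baseball-themed sales metrics and the activity points
# system described above; inputs are illustrative counts.

def batting_average(leads_converted, leads_total):
    """% of leads converted to opportunities, as a 3-decimal 'average'."""
    return round(leads_converted / leads_total, 3)

def slugging_percentage(opps_won, opps_total):
    """% of opportunities won out of total opportunities."""
    return round(opps_won / opps_total, 3)

def activity_points(events):
    """Points per lead event: created=1, qualified=2, converted=3."""
    points = {"created": 1, "qualified": 2, "converted": 3}
    return sum(points[e] for e in events)

print(batting_average(1, 4))       # 0.25 -> '1 out of 4 leads converted'
print(slugging_percentage(5, 10))  # 0.5 -> half the opportunities are won
print(activity_points(["created", "qualified", "converted"]))
```

Ranking salesreps is then just sorting them by points or by a chosen metric.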

Hope the above gave you some ideas on how you can gamify your sales teams!

Oracle Exadata, MPP Databases or Hadoop for Analytics


INTRODUCTION

There is a plethora of databases today: SQL databases, NoSQL databases, open source databases, columnar databases, MPP databases, etc. Oracle, a leader in the relational database space, is often compared to these. So let us look at some basic differences between Oracle and other players like MPP databases (Teradata, Vertica, Greenplum, Redshift, Netezza, etc.) and Hadoop for various analytics workflows.


ARCHITECTURE

Oracle, Teradata, Vertica, Greenplum, PostgreSQL, Redshift, and Netezza are all relational databases. However, Teradata, Vertica, Greenplum, PostgreSQL, Redshift, and Netezza are massively parallel processing (MPP) databases, which have parallelism built into each component of their architecture. They have a shared-nothing architecture and no single point of failure. On the other hand, the Oracle database has a shared-everything architecture. Even Exadata (Oracle's engineered system, or database appliance, specifically for analytics/OLAP) is based on the existing Oracle engine, which means any machine can access any data; this is fundamentally different from Teradata, as shown in the diagram below. MPP databases are thus able to break a query into a number of DB operations that are then performed in parallel, increasing the performance of the query.

This brings us to the next logical question: how are these MPP databases different from Hadoop? Hadoop is also an MPP platform. The more obvious answer would be that MPP databases are used for structured data, while Hadoop can be used for structured or unstructured data with HDFS, a distributed file system. Also, while MPP databases introduce parallelism mainly in the storage and access of data, Hadoop, with its MapReduce framework, is used for batch processing of large amounts of structured and unstructured data, more like an ETL tool. So it is a data platform.


MPP_Oracle_Architecture_small.png













USER INTERFACE

Oracle and most MPP databases use a SQL interface, while Hadoop uses MapReduce programs or Spark, which are Java-based interfaces. The Apache Hive project, however, is aimed at introducing a SQL interface over MapReduce programs.


INFRASTRUCTURE

The other difference between these systems is that most MPP databases like Teradata, as well as Oracle Exadata, run on proprietary hardware or appliances, while Hadoop runs on commodity hardware.


SCALABILITY

Oracle Exadata and most MPP databases scale vertically on proprietary hardware, while Hadoop scales horizontally, which results in a very cost-effective model, especially for large data storage.


STORAGE

The MPP databases use columnar data storage techniques, while Oracle uses row-wise storage, which is less efficient than columnar storage in both disk space usage and performance. However, Oracle Exadata uses Hybrid Columnar Compression (HCC), an aggregate data block created above the rows of data; compression is achieved by storing repeating values only once in the HCC block. Thus the performance of Oracle Exadata is considerably better than that of a row-wise-storage Oracle database. Hadoop, on the other hand, supports HDFS, which is distributed file storage.


USE CASES

Oracle is often the choice of database for Analytics where Oracle ERP systems are deployed. Oracle Exadata can meet OLAP workflow/ DSS requirements and has many Advanced Analytics options. More details can be seen at Oracle's Machine Learning and Advanced Analytics 12.2c and Oracle Data Miner 4.2 New Features.

 

Teradata is the choice of DB for pure OLAP workflows with its massively parallel processing capabilities, especially when data volumes are high. Teradata is also the preferred choice for low-latency analytics requirements where an RDBMS is still required. However, it is losing market share, as Teradata migration is a priority for most cost-conscious CEOs due to its prohibitive year-on-year expense. Another reason for migrating off Teradata is the adoption of new-generation data analytics architectures with support for unstructured data.

 

The above sets the stage for Hadoop, with its support for big data, which can be structured or unstructured. It provides a platform for data streaming and analytics over large amounts of data coming from IoT sensors, social data from various platforms, weather data, or spatial data. It is based on open source technologies and uses commodity hardware, which is another attraction for many companies moving from a data warehouse to a data lake ecosystem.

 

CONCLUSION

Thus, it is important to consider the use case a database is designed to serve before deciding the best fit for your big data ecosystem. Making a decision solely on the amount of data (petabytes or terabytes) to be stored might not be accurate. Other factors that can influence the decision are your overall IT landscape and preferred infrastructure platform, developer skills, cost, and future requirements, which are specific to each organization. So although new-age databases are opening new opportunities for data storage and usage, the traditional RDBMS will most likely not go away in the near future.

 

REFERENCES

https://docs.oracle.com/cd/E11882_01/server.112/e17157/architectures.htm#HAOVW215

https://downloads.teradata.com/blog/carrie/2015/08/teradata-basics-parallelism

 

 

FCCS Integration with Oracle Fusion Financials - End to end process and pain points


The integration of FCCS (or any other Hyperion cloud application) with Oracle Fusion Financials (Oracle GL Cloud) is said to be a "direct" integration. However, when you start to configure it, you realize that it's not as "direct" as it appears.

So here I am, explaining the steps involved and the points to note while setting up this integration right from configuring the connection to Fusion Financials up to the Drill Through from FCCS back to Fusion Financials.

1.      Setup Fusion Source System in FCCS Data Management

Setup the Source System of type Oracle Financials Cloud as you would normally do.

Note: For the Drill Through URL make sure to enter the Fusion Financials Cloud release URL format - "R12" for release R12 or earlier and "R13" for R13 release format.

2.      Create a User in Fusion Financials to establish connectivity

Create a new user in Fusion Financials with the correct roles. This user will be configured in FCCS Data Management setup.

Note: The user has to be assigned the following roles:

i. Employee

ii. Financial Analyst

iii. General Accountant

iv. General Accounting Manager

3.      Configure Source Connection in Data Management

After the user is created in Fusion Financials, go back to Data Management and configure the source connection with the user created.

Enter the Web Service URL of the Fusion Web Service and click on Test Connection.

Once the connection test is successful, click on Configure to save this configuration.

4.      Initialize the Source system

Select the Oracle GL source system and click Initialize. Initializing fetches all the metadata from Fusion GL that is needed in Data Management, such as ledgers and chart of accounts. The initialize process may take some time; you can monitor the progress on the Workflow tab under Process Details.

5.      Period Mappings

After the Source System is initialized successfully, an Essbase application with the same name as the Fusion GL application gets created in Data Management. All the metadata from Fusion GL is fetched into this application. The next step is to configure the Period Mappings for this application.

It is set up under the Source Mapping tab for both Explicit and Adjustment period types.

Select the Fusion application name as the Source Application and FCCS application as the Target. Add the period mapping.

Similarly, to bring in Adjustment data from Fusion GL, create the period mapping for the adjustment periods. Select Mapping Type as Adjustment and create the period mappings.

After this initial setup is complete, you are ready to create the Locations to import data from Fusion GL. The standard process of creating an Import Format, Location, Data Load Rule and Data Load Mappings can be followed to create the load locations. However, the Data Load Rule setup differs from file-based loads in that it has source system filters, which can be set up to filter or limit the data you would like to import from Fusion GL.

6.      Data Load Rule setup

Open the Data Load Rule for the Fusion GL location. Most of the fields are populated with default values. You may change the filter conditions per your import requirements and set up the data rule.

Review the source filters for each dimension and update as required.

With this, your "Direct" Integration setup is complete and you are ready to import data from Fusion GL and Export to FCCS.

7.      Drill Through setup

A very important feature of the direct GL integration is the ability to drill through from FCCS to Data Management right up to Fusion GL.

In Smart View, when you drill through for a particular amount, it takes you to the Data Management landing page.

When you right click on the amount on the DM landing page and select Drill Through to Source, it will take you to the GL Balance Inquiry landing page to see the details of the individual transaction records.

Some pre-requisites to perform drill through successfully:

Note:

i. The user performing the drill through needs to have a Data Access Set assigned in order to view the Inquiry page. Without this, you get an error saying Invalid Data Access Set.

To assign Data Access Set,

· Login to the Fusion GL application. On the Home Page, go to Setup and Maintenance.

· Search: Manage Data Access for User

· Query the user name for which this access is to be granted.

· Add the Data Access Set for the roles assigned to this user.

 

ii. Another prerequisite is that the user has to be already logged in to Fusion GL while performing the drill through. If the user is not logged in and clicks Drill Through to Source, the error "You can only drill from Smart View into detail balances but not to Account Inspector" is displayed.

8.      Automation of GL Integration

The Fusion GL loads can be easily automated since they are not dependent on the presence of a source file.

You may automate it through Data Management by creating a Batch Definition and specifying the Data Load Rule.

You may also automate it completely, without having to log in to Data Management, using EPM Automate and Windows batch jobs; that automation is explained in detail in my earlier blog on EPM Automate.
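As a rough sketch of such an automation in Python (assuming the EPM Automate client is installed and on the PATH; the URL, user, password file, domain and data rule name below are placeholders, not values from this integration):

```python
import subprocess

# Placeholder connection details - replace with your environment's values.
EPM_URL = "https://example-fccs.epm.oraclecloud.com"
USER, PASSWORD_FILE, DOMAIN = "svc_dm_user", "password.epw", "exampledomain"

def build_commands(rule_name="FUSION_GL_RULE", start="Jan-18", end="Jan-18"):
    """Return the EPM Automate calls for one run of the direct GL load:
    log in, run the Data Management data load rule, log out."""
    return [
        ["epmautomate", "login", USER, PASSWORD_FILE, EPM_URL, DOMAIN],
        ["epmautomate", "rundatarule", rule_name, start, end,
         "REPLACE", "STORE_DATA"],  # import mode, export mode
        ["epmautomate", "logout"],
    ]

def run_batch(commands):
    """Execute the calls in order, stopping on the first failure."""
    for cmd in commands:
        subprocess.run(cmd, check=True)
```

The same sequence can equally be driven from a Windows batch file; the point is simply that no interactive Data Management login is needed.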


Oracle Cloud R13- One Time Payment Request


Oracle Fusion Financials Cloud R13 offers functionality to import invoice and payment details as a Payables Payment Request from external systems using a predefined FBDI template.

Supplier for the request

✓ Not an existing supplier in Oracle

✓ Entered as a Party

✓ Cannot be queried at the Manage Supplier page

✓ Cannot be reused for a standard invoice

✓ Bank details (account number, bank, branch) are required to be entered in the import data

FBDI Template - Payables Payment Request Import

✓ The 18A template can be downloaded from the link below:

Payables Payment Request Import

✓ Key Template Columns

Column Name | Details
Transaction Identifier | Invoice identifier to uniquely identify the OTP request
Party Name | Name of the Supplier/Party
Party Original System Reference | Reference information of the party from the source system
Party Type | Supplier type; can only be Person, Organization, Group or Relationship
Location Original System Reference | Source system reference for the location
Country | Country of the party address
Address Line 1, 2, 3, 4 | Address lines 1, 2, 3, 4 of the party address
City, State, Postal Code | City, state, postal code of the party address
Supplier Bank Details | Account country, currency, account #, type, bank & branch #
Business Unit & LE | Business Unit and Legal Entity
Source | One Time Payments (defined at Payables Lookup, Type = Source)
Invoice Details | Invoice number, date, currency, description, paygroup, payment term
Payment Method | Mode of payment
Liability Combination | Liability account code combination; if left blank, the value defaults from the setup

 

Mandatory Setups

1.       Enable One Time Supplier feature for the instance (View image)

2.       Add the OTP Code as Payables Source (View image)

3.       Add the source at Trading Community Source System (View image)

4.       Enter default Liability and Expense accounts at 'Manage Common Options for Payables and Procurement' (View image)

5.       Enter default location at 'Manage Business Unit' (View image)

Creating Payment Request Invoice and Payment

1.       Prepare the FBDI template with Payment Process Request data (View image)

2.       Generate csv and zip files from template (View image)

3.       Upload the zip to UCM Server. Account- fin/payables/import (View image)

4.       Run 'Load Interface File for Import' process to load the data to interface table (View image)

5.       Run 'Import Payables Payment Request' process. Source: OTP (View image)

6.       Invoice created (View image)

7.       Payment can be made by selecting 'Pay in Full' from Invoice Action or Creating a new Payment at Manage Payments.

 1099 Reporting for Payables Payment Requests

1099 reporting is not supported for One Time Payments; the assumption is that the source application generating one-time payments handles any tax requirements. If payments handled within Oracle Cloud Financials require 1099 reporting, then the supplier needs to be created in Oracle and paid by invoice.

 

Customer Loyalty: Past Forward

Back in the 80s, when televisions were introduced into households, marketers made an easy entry inside our homes and a new era of visually animated marketing began. The trick was simple: "be visible, be sold". Suddenly TV took centre stage for all marketing and promotion. Every brand, premier or not, wanted to connect with its customers and engage them via attractive advertisements appealing to their physical or cognitive needs. Since we as customers were generally not informed about the market and its offerings, anyone who could educate us about our needs and show us an available product for them could successfully seal the deal. In short, it was about educating -> engaging -> selling, and that worked for a long time, until the market was all levelled.

Leveraging Oracle Revenue Management Cloud System to Meet IFRS 15 Contract Cost Amortization Requirements


Oracle Revenue Management Cloud Service (RMCS) - Introduction

Oracle Revenue Management Cloud Service (RMCS) is an automated and centralized revenue management product that empowers organizations to comply with the ASC 606 and the Accounting Standard IFRS 15 requirements of revenue from contracts with customers. RMCS helps organizations in automating the identification and creation of customer contracts and performance obligations, their valuations, and the accounting entries through a configurable framework.

RMCS is tailor-made to meet the IFRS 15 / ASC 606 requirements, including the transition requirements. Apart from this, RMCS also provides robust integration with third-party applications, including Oracle EBS and other non-Oracle systems, to fulfill the requirements of IFRS 15.

Standard RMCS features enable organizations to recognize revenue from contracts with customers as per IFRS-15. However, the product does not offer features to amortize the contract costs as per IFRS-15. At Infosys we have extended the usability of RMCS to recognize and amortize Contract Costs as per IFRS-15.

This document provides a solution overview on recognizing and amortizing Contract Costs in RMCS and the initial accounting setups which are required. The content included in this document is industry or organization agnostic.

 

Recognizing Revenue from Contracts with Customers in RMCS

The most striking change in the recognition of revenue has been the introduction of the new five-step model for recognizing revenue.

View image

 

The new standard impacts all organizations required to report under IFRS and US GAAP. Since the change deals with revenue, it is expected to have organization-wide impact.

Sample case showing how Revenue is recognized as per IFRS-15 in RMCS: -

Domain: Telecom

Contract Period: 6 months

Contract Start Date: 15.01.2018

Contract End Date: 15.07.2018

Plan: Telecom plan which includes a monthly fixed fee of $100, along with a handset at the start of the plan

Say X Ltd sells the handset separately at $300, and the monthly fee for network services without a handset is $80.

1.       As per IFRS-15, X Ltd needs to identify the contract (Step 1) which is a 6-month contract with the customer.

 

2.       Then, X Ltd needs to identify all the performance obligations (Step 2) in the contract with the customer, which are:

·         provide a handset

·         provide network services over 6 months

 

3.       Decide the transaction price (Step 3) which is $ 600

 

4.       Apportioning the transaction price (Step 4) of $600 to each performance obligation based on their relative standalone selling prices:

Performance Obligation | Standalone Selling Price | Allocated % | Allocated Revenue | Revenue Recognized
Handset | 300 | 38.46% | 230.76 (600 * 38.46%) | 230.76
Network Services | 480 (80 * 6) | 61.54% | 369.24 (600 * 61.54%) | 61.54 (369.24 / 6)
Total | 780 | 100% | 600 | 292.30

 

5.       Recognizing the revenue (Step 5) when X Ltd satisfies the performance obligations:

·         Recognizing the revenue from the handset when X Ltd gives the handset to the customer: $230.76

·         Recognizing the revenue from network services at $61.54 monthly over the 6-month period of the contract.
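The step-4 arithmetic can be sketched in a few lines of Python (note: the step-4 table above rounds the allocation percentage to two decimals first, so its figures differ from the exact calculation by about a cent):

```python
def allocate(transaction_price, standalone_prices):
    """Allocate a transaction price across performance obligations in
    proportion to their standalone selling prices (IFRS 15, step 4)."""
    total_ssp = sum(standalone_prices.values())
    return {name: round(transaction_price * ssp / total_ssp, 2)
            for name, ssp in standalone_prices.items()}

# Handset SSP = 300; network services SSP = 80 * 6 = 480
allocation = allocate(600, {"Handset": 300, "Network Services": 480})
# Exact allocation: Handset 230.77, Network Services 369.23
```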

Expected Accounting entries generated in RMCS are as summarized below: -

Period | Description | Account | Debit | Credit | Event
T0 | Recognition of contract asset and liability in relation to the services and handset | Contract Asset | 600 | | Initial Performance
T0 | | Contract Liability | | 600 |
T1 | Monthly billing of revenue | Contract Clearing | 292.30 | | Performance Obligation Billed
T1 | | Contract Asset | | 292.30 |
T1 | Recognition of monthly revenue and its allocation | Contract Liability | 292.30 | | Performance Obligation Satisfied
T1 | | Revenue from Handset | | 230.76 |
T1 | | Revenue from Network Service | | 61.54 |
T2-T6 | Monthly billing of revenue | Contract Clearing | 61.54 | | Performance Obligation Billed
T2-T6 | | Contract Asset | | 61.54 |
T2-T6 | Recognition of monthly revenue and its allocation | Contract Liability | 61.54 | | Performance Obligation Satisfied
T2-T6 | | Revenue from Network Service | | 61.54 |


Accounting for cost as per IFRS-15

While IFRS 15 (Revenue from Contracts with Customers) is primarily a standard on revenue recognition, it also has requirements relating to contract costs. As a result, organizations may need to change their accounting for these costs on adoption of IFRS 15.

Prior to IFRS 15 there was no specific accounting standard addressing the accounting for such costs, and entities referred to a number of different standards and principles in accounting for the various types of costs incurred. The existing standards, IAS 18 Revenue and IAS 11 Construction Contracts, contained only limited guidance, mainly on applying the percentage-of-completion method (under which contract revenue and costs were recognized with reference to the stage of completion).

IFRS 15 introduces new guidance on accounting for the costs related to a contract:

View image

Basic Configuration required to achieve allocation in RMCS

 

  1.  Trading community source system: Source systems are uniquely defined in the system and are required to support all other setups.
  2.  Source document codes: The source document code is the base setup for defining source document types.
  3.  Source document types: Source document types are defined to indicate different lines of business. For example, if a business has a manufacturing line and a service line, then two source document types need to be defined.
  4.  Revenue system options: The different types of accounts, i.e. the contract asset, contract liability, contract discount, price variance and contract clearing accounts, are defined through revenue system options.
  5.  Standalone selling price effective periods: The pricing policy of the business, and how frequently it changes, helps determine the effective periods for defining the standalone selling price of different products.
  6.  Contract identification rules: Contracts are created in the system based on the contract identification rules. Different contract identification rules are created for different source document types.
  7.  Performance obligation identification rules: These rules define how different performance obligation lines are treated for a particular contract. Different performance obligation rules are created for different source document types.
  8.  Pricing dimension structures: Pricing dimension structures are used to define the different segments required in defining pricing dimension values.
  9.  Pricing dimension assignments: This setup assigns different source document types to different pricing dimension structures.
  10. Standalone selling price profiles: This setup defines items in different standalone selling price profiles and defines the standalone selling price. Profiles are created based on different pricing dimension assignments.

Extending the Usability of RMCS to Costing Scenario

In order to recognize contract cost as per IFRS 15 (if the contract period is more than 12 months), organizations need to amortize the contract cost over the period of the contract. In this scenario, the Contract Cost account should be debited and the Contract Asset account credited. But out of the box in RMCS, while recognizing revenue, the Contract Liability account is debited and the Contract Revenue account is credited. Therefore, in order to achieve the accounting for contract cost, we need to apply Sub Ledger Accounting in RMCS.

Sub ledger Accounting supports multiple accounting representations concurrently in a single instance. We can create a particular set of rules for specific transactions and create accounting for the transaction with the accounting methods defined.

 

Configurations Required for Sub Ledger Accounting to Meet Cost Amortization

The relationship of the components used for Sub Ledger Accounting is best understood as follows:

View image

After completing basic RMCS configuration, below are the high level SLA setups required to achieve Costing Scenarios:

        I.            Accounting Method: We need to create a new accounting method to define the accounting treatment for each accounting event class and accounting event type in the costing scenario.

      II.            Account Rules: Account Rules enable us to define the logic of determining the segment value to be used for each transaction. We create different rule types to fetch Account combination, Segment, and Value Set.

In order to create accounting entries for costing scenario, we need to create three different Account rules, namely: -

a.       Contract Liability Custom Account Rule

b.      Contract Asset Custom Account Rule

c.       Contract Revenue Custom Account Rule

Each of the above rules should have a condition to identify and alter the account for cost contracts.

 

    III.            Sub Ledger Journal Entry (JE) Rule Set: Sub Ledger Journal Entry Rule Sets enable us to generate a complete JE for an accounting event. This set of rules needs to be validated before it can be linked to the accounting method for the sub ledger. A Sub Ledger Journal Entry Rule Set can be assigned only to a Sub Ledger Accounting Method with the same chart of accounts. Before creating the Sub Ledger Journal Entry Rule Set, ensure that the below subcomponents, if required, are correctly defined:

a.       Description Rules

b.      Journal Line Rules

c.       Account Rules

 

    IV.            Accounting Method assignment: After creating the Sub Ledger Journal Entry Rule Set, assign the rule set to the previously created accounting method. The status of the accounting method is initially Incomplete.

      V.            Activate Sub Ledger Journal Entry Rule Set Assignments: In order to activate the accounting setups, submit the Activate Sub Ledger Journal Entry Rule Set Assignments process. The status of the accounting method should then be Active.

Expected Accounting Entries in RMCS for Costing Scenario: -

Say the contract cost to be amortized is $2,400 over 24 months.

Period | Description | Account | Debit | Credit
T0 | Initial performance | Contract Asset | 2400 |
T0 | | Contract Liability | | 2400
T1-T24 | Amortization of cost | Contract Cost | 100 |
T1-T24 | | Contract Asset | | 100
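The T1-T24 entries above are a straight-line amortization; a small sketch that generates the schedule (with any rounding residue pushed to the final period so the total reconciles exactly):

```python
def straight_line_schedule(total_cost, periods):
    """Return per-period amortization amounts (debit Contract Cost,
    credit Contract Asset) that sum exactly to total_cost."""
    base = round(total_cost / periods, 2)
    amounts = [base] * (periods - 1)
    # Put the rounding residue in the last period.
    amounts.append(round(total_cost - base * (periods - 1), 2))
    return amounts

schedule = straight_line_schedule(2400, 24)  # 24 amounts of 100.0 each
```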

Conclusion: RMCS with Custom Sub Ledger Accounting - A Game Changer!

With Oracle RMCS and Infosys' tried-and-tested SLA extensions, organizations can not only recognize revenue according to IFRS 15 but also recognize and amortize contract costs as per IFRS 15. This enables organizations to have a single Oracle-supported solution for meeting all their IFRS 15 requirements.

Please reach out to Infosys/authors for your organization specific requirements and to leverage the power of RMCS to ease your transition to IFRS 15 standard.


Optimising Omni-Channel order Interface - Batch or individual mode?


In a world where the attention span of customers is fast reducing, the longer the response time, the greater the probability of losing the customer. Customers expect their order to be updated at appropriate times and want to know its status within minutes, if not seconds. In this context, organisations handling sales orders in Omni-channel order processing face a dilemma: wait and process the sales orders in batch mode to optimise the load on applications, or process the orders immediately as individual orders to service the customers faster? This blog tries to unearth this mystery by examining the various factors that affect this decision.

 


Pic 1 - Batch or Individual mode

What's what:

In a true Omni-channel scenario, sales orders are created via various means: an ecommerce web site, an app on a mobile or tablet, traditional order entry, customer service team order entry, orders entered in retail stores, orders via EDI, orders created at kiosks in shopping malls, etc. While order capture systems are many, there is traditionally one fulfillment application (for example, Oracle Fusion Cloud). There are two ways of interfacing, enriching, importing and processing these orders in the fulfillment application. Let us look at a theoretical definition of the options available:

  • Batch mode: Interface all sales orders from various applications into one single repository. At a specified time, enrich, import and process the orders as one batch by running batch processes. To improve performance, static or dynamic batching (batched per order type or order source) is used
  • Individual mode: Interface, enrich, import and process every single sales order from various applications individually, i.e. order by order. To improve performance, multi-threading is used

 

Following are a few of the key considerations before the sales order interface architecture is decided.

1. Order cycle:

The order life cycle can range from weeks (for example in the hi-tech industry) to just a few hours (typically in the retail industry). But irrespective of the industry, with the advent of digitalization the life cycle of orders is fast decreasing. The length of the order cycle needs to be considered before selecting the mode of order import.

Verdict: Short life cycle → import individual sales orders

2. Order volume, order clock and seasonal:

2.1 Volume: In the retail industry, the volume of sales orders is quite high; for example, there are retail customers creating 10,000 to 4 million sales orders per day. When the volume is high, the general trend is to batch the sales orders and import them at a regular frequency during the day. This optimizes the utilization of server resources.

Verdict: High volume → choose batch mode

 

2.2 Order clock: Order clock is the distribution of sales order volume throughout the day. Following are certain examples

  • There are certain B2B customers or retail stores where sales orders are created throughout the day but interfaced only at midnight or after office hours
  • Field service engineers create orders in their handheld devices throughout the day, but these orders are interfaced only when the handheld device is physically synced in the depot or service station in the evening, when the engineers close the day

When the order clock is skewed, there will be a sudden rush of sales orders at one particular point in the day, while the rest of the day receives a normal flow of orders. At these peak volumes, individual order import may not be the best method.

Verdict: At peak loads → choose batch mode

 

2.3 Seasonal orders: Certain items have seasonal demand, and hence a sudden inflow of sales orders during a particular time of the week, month or year, where individual order import may struggle.

Verdict: At seasonal loads → choose batch mode

3. Operational clock and throughput:

There are organizations with stringent operational clocks. A simple example: delivery within 30 minutes or the pizza is free. The pharmaceutical retail industry needs to fulfil orders on the same day, while an online furniture-selling organisation has the liberty of taking a month to manufacture and ship the item ordered. Care must be taken before sales orders are batched when organizations have such needs.

Verdict: Stringent operational clock → choose individual mode

Pic 2 - Multiple factors of decision making

4. Size of the orders:

In the hi-tech industry a sales order can have one high-value custom-manufactured item, while in the retail industry a sales order can contain a laundry list of low-value items. If sales orders with thousands of lines are imported one by one, subsequent orders may end up queuing for import.

Verdict: Fewer lines per order → choose individual mode

5. Individualization:

In an age where "good old" emails are being replaced with apps, chat and voice services, individual customer preferences need to be considered. Customer needs vary from getting an update once a day to getting every single update on their sales order. Hence the choice of the customer needs to be kept in mind while deciding whether a sales order can wait for a batch process or not. For example, a furniture retail organization will have updates sent to the customer once a week or once in a few days; such an industry can choose batch processing.

Verdict: Fewer and less frequent updates to the customer → choose batch mode

 

Apart from these parameters, the two subjective parameters below also play a key role in the design choice.

6. Capability of the ERP:

With organisations wanting to stick to standard functionality rather than customise the application, the capability of the chosen ERP becomes one of the first factors to be considered. In the current release (R13) of Oracle Order Management Cloud (OMC), there is no means to import sales orders into OMC in batch mode, or to interface sales orders from OMC to EBS in batch mode. This restriction is an example of an ERP's capability constraints. The case study is explained in detail towards the end of this blog.

Verdict: Pick up the option based on capacity and appetite to customise

7. IT landscape and Architecture:

This is a broad topic, mostly technical in nature, that nevertheless needs to be factored in before the sales order import mode is decided. It encompasses the following details, to name a few:

  • Are the applications involved on cloud, on-premises, or in a hybrid model?
  • Are there multiple firewalls through which the sales orders need to be interfaced for order creation and status updates?
  • What middleware is used to integrate these applications, and how robust is it?
  • Performance considerations, and how the applications and middleware are sized
  • The list of applications the order has to pass through before closure. For example, a sales order may pass through the order entry application, order orchestration, order fulfillment systems (purchasing, manufacturing and warehouse applications), the transport application and the invoicing application. The longer the chain, the more difficult it is to make MACD (modify, add, change, delete) changes when sales orders are individually imported
  • The technology involved. For example, ODI (Oracle Data Integrator) can make batching faster when invoked from middleware like SOA, while individual order creation via the order creation web service in Order Management Cloud is faster when order transformation is minimal
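The verdicts above can be condensed into a toy decision function; the thresholds and field names are illustrative only, not a prescription:

```python
def choose_import_mode(order_profile):
    """Apply the blog's verdicts in order: batch-favoring volume factors
    first, then individual-favoring responsiveness factors."""
    p = order_profile
    # Verdicts 2.1-2.3: high, peaked, or seasonal volume favors batch.
    if p["peak_load"] or p["seasonal_spike"] or p["daily_volume"] > 100_000:
        return "batch"
    # Verdicts 1 and 3: short cycles and tight SLAs favor individual import.
    if p["short_life_cycle"] or p["stringent_ops_clock"]:
        return "individual"
    # Verdict 4: small orders import cleanly one by one.
    if p["lines_per_order"] <= 50:
        return "individual"
    # Verdict 5: infrequent customer updates tolerate batching.
    return "batch" if p["update_frequency"] == "low" else "individual"
```

As the case study shows, real deployments may need a hybrid of both modes rather than a single answer from any such rule.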

 

Case Study:

The case study concerns an optical retail chain giant, operating globally, which offers optician services along with eyeglasses, contact lenses and hearing aids, with a global turnover of £1.8 billion in 2017. Following is a summary of the requirements:

  • The sales order volume was expected to be 1.16 million lines per day when all business units are live in OMC
  • The retail orders are imported from multiple legacy applications into Oracle Order Management Cloud (OMC) for consolidation, orchestration, and routing
  • These updated sales orders are then interfaced from OMC into the eBusiness Suite (EBS) OM module for fulfillment
  • Order cycles are short:
    • All pick-to-order sales orders created before 6 PM ship the same day before 10 PM (35% of overall volume)
    • Sales orders that require manufacturing have a life cycle of 7 days (65% of overall volume)
  • The size of the sales order varies from 500 lines (maximum for pick, pack, ship orders) to 30 lines (manufacturing SOs)

These requirements are plotted on the below chart



Pic 3 - Business requirements in the case study

The above chart can be used as a decision tool: each dot (requirement) falls in either the batch or the individual region, guiding the decision.

 

Below are the subjective parameters for the client.

  • Order Management Cloud, as an application, could be scaled up to import the sales orders individually despite the nature of the business and the volumes explained above. But the interface between OMC and EBS could not be managed with the standard connector provided by Oracle, so a custom component was suggested. The following diagram shows the standard functionality of the application (individual mode) and the custom component (batch mode)

 

        STANDARD FUNCTIONALITY (INDIVIDUAL MODE)                     CUSTOM FUNCTIONALITY (BATCH MODE)


Pic 4 - Standard and custom connector to integrate OMC and EBS


  • The client had procured SOA 12c for designing and building new interfaces and had controlled approval for customizations

 

Based on the requirements plotted in the above-mentioned tool and the subjective parameters for the client, neither the individual nor the batch model could be suggested as the sole design choice. Even though the average numbers of orders and order lines are well within the parameters for individual mode, the outliers in order volume per day demanded batch import. Hence a hybrid option was suggested for the client: sales orders are imported in individual or batch mode based on order type, volume of orders at a point in time, time of day, etc.

Pic 5 - Comparison of different methods

Conclusion:

Batch, individual and hybrid sales order import are equally good options and have their merits and demerits.

  • Batch processing can reduce load on the applications and ensure there is a fixed time for orders to be imported and statuses to be sent to the customer, while individual order import quenches the information thirst of a curious customer and ensures no harm is done to the service level agreement with the customer
  • Batch mode can hold sales orders briefly and delay the life cycle, while individual order import may cause too much network traffic and delays as sales orders queue up at times of peak load
  • There are times when a parallel universe is needed, i.e. both batch and individual order import implemented side by side. Cherry-picking orders for import may involve higher cost in terms of maintaining two code versions, but it certainly guarantees a balance between the two modes, both functionally and technically


While there is no standard formula, the details given in this blog help architects nail the right approach for each customer, and in fact for each sales order in particular.

Robotic Process Automation - Tool Selection Overview


 

Robotic Process Automation - Tool Selection Process


 

Introduction

It is very important for organizations to select the right tool and service provider when they embark on their robotic process automation journey, and this blog focuses on the tool selection approach.

In this blog, we take a peek at the 'RPA tool selection' approach typically taken by organizations, the typical questionnaire items, and how an IT partner can prepare for these asks and add value by adhering to them strictly.

 

Tool Selection

As a prerequisite to tool selection, an organization needs to analyze its business processes and identify the areas that are good candidates for RPA. These business processes can be identified by scoring each process against the following simple questions:

Each question is scored on a three-point scale (1 - Less Likely, 2 - Agree, 3 - Strongly Agree):

1. Is the process or step routine/simple, involving repetitive steps?

2. Is the process or step rule-based, without ad-hoc or unstructured decision making?

3. Is the data involved in the process or step captured or available in a structured format and fields?

4. Is the data involved in the process or step digital?
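As an illustration, the scoring above can be wrapped in a small helper. The suitability cut-offs below are hypothetical, not part of any standard methodology:

```python
# Hypothetical RPA candidate scorer for the four questions above.
# Each answer is scored 1 (Less Likely), 2 (Agree) or 3 (Strongly Agree).

QUESTIONS = [
    "Routine/simple with repetitive steps?",
    "Rule-based, no ad-hoc decision making?",
    "Data captured in a structured format?",
    "Data is digital?",
]

def score_process(answers):
    """Return the total score and a rough suitability label."""
    if len(answers) != len(QUESTIONS) or not all(a in (1, 2, 3) for a in answers):
        raise ValueError("expected one score of 1-3 per question")
    total = sum(answers)
    # Illustrative cut-offs: the maximum score is 12, the minimum is 4.
    if total >= 10:
        label = "strong RPA candidate"
    elif total >= 7:
        label = "possible candidate - review further"
    else:
        label = "poor candidate"
    return total, label

print(score_process([3, 3, 2, 3]))  # (11, 'strong RPA candidate')
```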


Based on the strategy for automation, organizations usually classify the automation-eligible business processes into the four quadrants listed below:
 
Diagram_Tool_1.jpg

For the identified business process, all relevant stakeholders (business process users, business process owner, functional SMEs, IT) should be involved in evaluating the RPA tools under consideration.

The following parameters are typically used by organizations to select among the tools offered by different service providers:


Diagram_Tool_2.jpg


The above points are elaborated in the chart below. Organizations score each option under consideration against these parameters, and a 'winner' can be identified.


Parameters for Evaluation

Tool Capability

Dashboard & Reporting
• UI simplicity, user friendliness and navigation
• Ease of generating standard or ad-hoc reports
• Ability to customize the look and feel of dashboards, charts, etc.

Design & Build Usability
• Ease of use for developing or changing a process automation
• Out-of-the-box connectors for common platforms (e.g., Excel, Siebel, SAP)
• Robot triggering capability for executing automations
• Process automation recording capability
• Capability to automatically create KT documents such as process flow diagrams and guides
• Capability of the tool to identify process improvement opportunities
• Available documentation for training and support

Administration, Management and Allocation of Robots
• Sequencing, workload distribution and prioritization capabilities
• Centralized and decentralized robot allocation capability
• License sharing for robot usage, or dedicated licensing

Impact on IT Architecture
• Ease of integration with other internal and external systems in the organization's IT landscape
• Centralized or decentralized change and configuration management capability
• Solution adherence to IT architecture standards
• Ease of code migration to TEST and PROD environments, and rollback methodology
• Scalability of the tool to handle higher volumes during peak hours and increased volume over time

Access & Security
• Role/group-based access and ease of access management
• Ability to integrate with existing SSO or authentication tools such as Active Directory or LDAP
• Audit trail maintenance

Compliance & Regulatory Standards
• Adherence to regulatory guidelines, e.g. SOX compliance
• Audit trail and log creation for audits

Licensing
• Robot license costs and license-sharing options

Design and Build Cost
• Cost and effort estimates for end-to-end implementation, including design, build and test

Change Mgmt. / User Training
• Train-the-trainer and change management support, i.e. availability of training material

Maintenance & Support
• Cost and resource estimates for technical upgrades and maintenance
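The parameter-based scoring and 'winner' selection can be sketched as a weighted-score comparison. The weights, parameter groupings, tool names and scores below are entirely made up for illustration; real evaluations would use the full parameter list above with stakeholder-agreed weights:

```python
# Hypothetical weighted scoring of RPA tools against evaluation parameters.
weights = {"tool_capability": 0.4, "it_architecture": 0.2,
           "security_compliance": 0.2, "cost_licensing": 0.2}

# Stakeholder scores per tool on a 1-5 scale, purely illustrative.
tools = {
    "Tool A": {"tool_capability": 4, "it_architecture": 3,
               "security_compliance": 5, "cost_licensing": 2},
    "Tool B": {"tool_capability": 3, "it_architecture": 4,
               "security_compliance": 4, "cost_licensing": 5},
}

def weighted_score(scores):
    """Weighted sum of a tool's parameter scores."""
    return sum(weights[p] * s for p, s in scores.items())

winner = max(tools, key=lambda t: weighted_score(tools[t]))
for name, scores in tools.items():
    print(f"{name}: {weighted_score(scores):.2f}")
print("Winner:", winner)
```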


Conclusion

Though it sounds simple, this tool selection and business process identification activity is one of the key steps in a robotic process automation initiative and in laying down its scope. Due diligence is very important at this stage to avoid scope creep, misalignment with the business process, and wasted time and effort for the team involved.

Mobile First strategy with Oracle CX Mobile App to boost your Sales



In today's competitive world, organizations are emphasizing building a culture of proactive sales, with responsibility for identifying leads, opening new accounts and closing deals with agility. To enable this, organizations are looking to recent technology innovations to equip the sales team with accurate, anywhere, anytime access to information that makes them more proactive and productive.

In recent times, the most revolutionary change in the technology world has been mobility. It has empowered salespeople to quickly have the right information in their hands, and it enables collaboration with teams while on the move.

In this blog, we discuss the must-have mobile use cases and capabilities that help instill a culture of an agile sales organization using the Oracle CX Mobile App.

 

 


Next Best Action: Upcoming Appointments and Due Tasks

Salespeople typically start their day by checking their calendar for appointments and their to-do list. An integrated view of the week's appointments and due tasks helps them plan their activities better and more effectively.

The Oracle CX Mobile App gives salespeople a calendar view listing daily appointments and tasks, with timely reminders. Using the mobile app, salespeople can directly capture meeting notes, call reports, debriefings and action items, which can be immediately shared with the extended teams and synced back to Oracle Sales Cloud for further action.

 

 

Optimize Customer Visits


Salespeople often struggle to plan their customer visits optimally, and sometimes they miss customer visits because they do not have the right details and coordinates of the customer meetings. This leads to customer dissatisfaction and possibly loss of the deal.

The View Map feature in the Oracle CX Mobile App helps the sales team view and plan all customer visits near the same location in a single trip. The feature takes the salesperson's current location coordinates and displays the current deals in that vicinity. This helps them plan their trips effectively, avoid back-and-forth trips to the same location, and improves the sales team's productivity in customer engagements.
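To illustrate the idea of grouping nearby visits, the sketch below sorts open deals by great-circle distance from the salesperson's current location. The deals and coordinates are invented, and this is not how the CX Mobile App itself is implemented:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Invented open deals with customer coordinates.
deals = [
    ("Acme Corp", 51.5074, -0.1278),
    ("Globex", 51.7520, -1.2577),
    ("Initech", 51.4545, -2.5879),
]

def plan_visits(cur_lat, cur_lon, deals):
    """Return deals sorted nearest-first from the current location."""
    return sorted(deals, key=lambda d: haversine_km(cur_lat, cur_lon, d[1], d[2]))

for name, lat, lon in plan_visits(51.5, -0.12, deals):
    print(name)
```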

 

Manage Deals on the Go

Most of the time, salespeople show little interest in capturing information in the application. The reasons can be many, ranging from time constraints to a cumbersome application, no network connectivity, or simply being always on the move. This can lead to a potential leak in the salesperson's targets and the organization's revenues.

With the Oracle CX Mobile App, the sales team can manage their leads, opportunities and contacts easily on the go; its simple, easy-to-use interface makes entering data easy both online and offline. That allows salespeople to spend more time on client engagements instead of figuring out what to capture in the application.

The Oracle CX Mobile App also offers voice navigation: it works as a virtual sales assistant and interactively guides the sales team through common activities.

 

Collaborative Workforce

Salespeople often struggle to get the right information to close a deal, which leads to slipped deadlines, moving targets and revenue loss that no sales organization would ever want.

The Oracle CX Mobile App breaks this silo by encouraging team collaboration through the built-in Oracle Social Network tools. This empowers the sales team to collaborate with various teams across the organization and get the required information on time.

A point to note: in the present version of the CX Mobile App, users need to install the Oracle Social Network app for collaboration. We hope that in the future all of this will be in one app that does it all.


Analytical Insights

Gone are the days when business analytics and intelligence reports were accessible only on desktops or laptops and meant only for managers. Being the front face of the organization, salespeople are responsible for mining and identifying new deals, so they are constantly looking for analytical tools that let them access customer information while on the move, or while sitting in a meeting room with customers, with little or no dependency on IT teams.

With the introduction of mobile-enabled business analytics and intelligence tools, salespeople can now slice and dice the data by a customer's historic trends, purchase patterns and products of interest. With these analytic tools at their disposal, they can create new up-sell opportunities on the fly and have the flexibility to offer something new to customers while on the move.

Oracle CX Cloud supports built-in analytics as part of the Oracle CX Mobile app, which helps the sales team access information and present it to the right forums at the right times. After all, when it comes to closing deals, the more reliable information the sales team has, the better it can achieve the organization's objectives and close the deals.
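As a flavor of the slice-and-dice analysis described above, the sketch below aggregates an invented purchase history to surface each customer's biggest product line as an up-sell cue; the data and function names are hypothetical:

```python
from collections import defaultdict

# Invented purchase history: (customer, product, amount).
history = [
    ("Acme", "Laptops", 12000), ("Acme", "Monitors", 3000),
    ("Acme", "Laptops", 8000), ("Globex", "Printers", 1500),
    ("Globex", "Printers", 2500), ("Globex", "Toner", 400),
]

def top_product_by_customer(rows):
    """Slice purchase history by customer and find each one's biggest product line."""
    totals = defaultdict(lambda: defaultdict(int))
    for customer, product, amount in rows:
        totals[customer][product] += amount
    return {c: max(p, key=p.get) for c, p in totals.items()}

print(top_product_by_customer(history))
# {'Acme': 'Laptops', 'Globex': 'Printers'}
```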


       

Note: Figures courtesy Oracle CX Cloud documentation

Extensibility of Mobile UI

The CX Mobile App offers a simple capability to tailor the UI to meet varying customer needs.

Having said that, the simplified UI or web UI remains the primary UI and mobile is the alternate UI; while designing the mobile experience, we need to keep in mind the form factor of the mobile device and carefully plan the priorities for the sales team while they are on the move.

Certain shortfalls in the existing mobile app that we would like Oracle to incorporate in future releases:

          • Integration with instant messengers like WhatsApp, WeChat or Telegram to foster internal sales collaboration for quick responses

          • An integrated business card scanning app, which would encourage sales reps to transfer contact data directly into the Sales Cloud application.

          • A comprehensive voice chatbot with natural language support (AI), which would enable sales reps to record transactions on the go, from creating call reports to updating activities.


          Conclusion:

          By choosing a mobile-first strategy with the Oracle CX Mobile app, organizations can reap the benefits: improved salesperson productivity, reduced costs, automated manual activities, and increased revenues through new deals and customers. After all, this is for the betterment of the organization, and who dares to say no? So get ready to embrace the mobile-first strategy.




