Tuesday, June 14, 2016

BIP: Distinct

 


BIP has a distinct-values function:

xdoxslt:distinct_values
If you use it on a set of element nodes, it returns a space-separated sequence of their distinct values.
E.g., for this XML:


<ROWSET>
    <ROW>
        <CwaProductCode>001</CwaProductCode>
    </ROW>
    <ROW>
        <CwaProductCode>002</CwaProductCode>
    </ROW>
    <ROW>
        <CwaProductCode>001</CwaProductCode>
    </ROW>
    <ROW>
        <CwaProductCode>003</CwaProductCode>
    </ROW>
</ROWSET>


Using <?xdoxslt:distinct_values(CwaProductCode)?> gives:

001 002 003

Using <?count(xdoxslt:distinct_values(CwaProductCode))?> gives:

3

But what if one needs to group by the distinct values and then count the number of elements in each group? Then distinct_values alone can't help.


<?for-each-group:ROW;./CwaProductCode?>
<?CwaProductCode?><?'-'?><?count(current-group()/.)?>
<?end for-each-group?>
Gives the answer:

001-2
002-1
003-1
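
For readers who prefer to see it outside the RTF template: BIP grouping maps onto XSLT 2.0 grouping, so a rough standalone equivalent (a sketch for illustration only, runnable with any XSLT 2.0 processor against the sample XML above) would be:

<?xml version="1.0" encoding="UTF-8"?>
<!-- Sketch: standalone XSLT 2.0 equivalent of the BIP for-each-group above -->
<xsl:stylesheet version="2.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
    <xsl:output method="text"/>
    <xsl:template match="/ROWSET">
        <!-- group the ROW elements by CwaProductCode, then count each group -->
        <xsl:for-each-group select="ROW" group-by="CwaProductCode">
            <xsl:value-of select="concat(current-grouping-key(), '-', count(current-group()), '&#10;')"/>
        </xsl:for-each-group>
    </xsl:template>
</xsl:stylesheet>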


Wednesday, June 1, 2016

What caused the NA14 outage?

 

There is a very nice in-depth article here explaining what happened during the SFDC outage on May 9th, 2016, when the NA14 instance went down. It also discusses the aftermath and the lessons learnt from the outage.


Friday, May 20, 2016

HCM: Printing Hours and Rates on Checkwriter

 

Fusion HCM Cloud uses BIP templates as part of its payroll process; they are also used for printing checks. One of the most common requirements I have received is to print the employee's hours and rates on the check writer output.

Here is what the vanilla template looks like; it's called USCheckWriterReport.rtf.


image

The earnings section at the bottom prints only the current and YTD amounts; the hours and rates are not printed by default.

The hours and rates are in a different node of the payslip XML.


This is the node for earnings:

image

And this is the one for hours and rates:

image

So what we need to do is use REPORTING_NAME as a foreign key, look up the correct HOURS_X_RATE record, and print it in the earnings table.


This is the code to use:

<?$ava_earnings/GLB_PAY_ARCH_PR_HOURS_X_RATE[REPORTING_NAME=current()/REPORTING_NAME]/EARNINGS_HOURS?>
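
For context, a sketch of how that lookup might sit inside the earnings loop of the RTF template. $ava_earnings is assumed to already point at the payroll archive parent node (as it does in the vanilla template); GLB_PAY_ARCH_PR_EARNINGS and EARNINGS_RATE are placeholder names, so check your own payslip XML for the exact elements:

<?for-each:$ava_earnings/GLB_PAY_ARCH_PR_EARNINGS?>
    <?REPORTING_NAME?>
    <?$ava_earnings/GLB_PAY_ARCH_PR_HOURS_X_RATE[REPORTING_NAME=current()/REPORTING_NAME]/EARNINGS_HOURS?>
    <?$ava_earnings/GLB_PAY_ARCH_PR_HOURS_X_RATE[REPORTING_NAME=current()/REPORTING_NAME]/EARNINGS_RATE?>
<?end for-each?>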

 

PS: I am just putting this up for my own reference, and for anyone out there stuck like me.


Wednesday, May 11, 2016

Major SFDC Outage

 

Today there is a Hacker News thread describing a major SFDC outage.

 


This is the side effect of cloud computing: a single outage affects multiple businesses at once.

 


Friday, April 15, 2016

Fusion HCM: Testing Payroll templates

 

Oracle’s Fusion HCM cloud service uses BIP templates as part of its pay processing. When the payroll processes are submitted, HCM does a lot of things: it generates the pay data and populates tables. The final step of this process is to generate the actual payslip, and for this it invokes a BIP report.

As a report designer, I often get requests to modify these reports. But to test the changes, one need not run the entire payroll process again; simply re-running the report with the modified template will suffice.

The BIP template to modify for the US is USOnlinePayslip.

image

Its location is:

image

To see when it was last run, simply check the history of this report under More -> History. Reset the query and run the search.

image

 

Pick any of the recent records and drill down on it. Now copy the Payroll Action Identifier from this page.

image

Now you can go back and re-run the report with this parameter, and see your template changes instantly! Ta-da!

Friday, April 8, 2016

Oracle Sales Cloud: Getting around OSC’s WSDL parsing, Siebel UCM

 

I have recently been working on integrating Oracle Sales Cloud with Siebel UCM for Accounts. Oracle Sales Cloud can read a WSDL specification from a URL, but you still have to build the request message using a Groovy script (would be nice to shoot whoever designed this), so you write a global function to build up the message. It turns out you can only add elements and attributes that are already in the parsed WSDL; nothing new can be added.

The problem here is that for integrations with Siebel UCM, a hidden attribute named ExternalSystemId has to be populated in the incoming message. This attribute is not in the WSDL generated from Siebel UCM, but it has to be sent in the SOAP request (would also be nice to shoot whoever designed this). Otherwise UCM rejects the call with:

Error invoking service 'UCM Transaction Manager', method 'SOAPExecute' at step 'Transaction Manager'.(SBL-BPR-00162)
--
<?> Failed to find ExternalSystemId in input message(SBL-IAI-00436)

 

This is what you get when you consume the UCM WSDL in SoapUI.

image

The actual message has to look like this (see the highlighted changes):

image
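
Since the screenshot does not reproduce here, a rough sketch of the shape of the corrected request; the service, element, and namespace names below are placeholders, the point is only that ExternalSystemId rides as an attribute on the top container element of the payload:

<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:acc="http://example.com/AccountService">
    <soapenv:Body>
        <acc:AccountInsertOrUpdate_Input>
            <!-- ExternalSystemId carries the name of the registered external system -->
            <acc:ListOfSwiAccountIO ExternalSystemId="OSC">
                <acc:Account>
                    <acc:AccountName>Acme Corp</acc:AccountName>
                </acc:Account>
            </acc:ListOfSwiAccountIO>
        </acc:AccountInsertOrUpdate_Input>
    </soapenv:Body>
</soapenv:Envelope>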

 

If you add Groovy script to set this attribute, OSC will simply ignore it, and the attribute is not sent to UCM. The only viable workaround is to edit the WSDL AFTER it is generated, but BEFORE it is given to Sales Cloud!

1: Generate the WSDL from UCM.

2: Open it in an XML editor; use XMLSpy if you have it.

3: Find the definition of the top container element in the WSDL:

image

4: Add this text (highlighted):

image

<xsd:attribute name="ExternalSystemId" type="xsd:string"/>
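
For orientation, the added line sits inside the complexType of that top container element. Everything below except the xsd:attribute line is a placeholder sketch, not the real UCM schema:

<xsd:complexType name="ListOfSwiAccountIO">
    <xsd:sequence>
        <xsd:element name="Account" minOccurs="0" maxOccurs="unbounded"/>
    </xsd:sequence>
    <!-- the new attribute, added after the WSDL is generated -->
    <xsd:attribute name="ExternalSystemId" type="xsd:string"/>
</xsd:complexType>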

5: Now validate, save, and upload this WSDL to a public folder from where OSC can read it. OSC does not consume WSDLs; it reads the definition on the fly.

6: Now add the Groovy script to populate this new attribute with the registered SystemId name.

 

Phew!!

image


Thursday, March 31, 2016

TCC: Handling Encoding

 

Aahh… multilingual. That word increases the complexity of any project instantly. When your enterprise application is multilingual, users will be able to add and edit data in different languages, and the data can no longer be stored in ASCII/ANSI format. East Asian, Middle Eastern, and some European languages require more than one byte for a single character, so the data has to be stored in a Unicode encoding. While working on Taleo’s TCC scripts, I recently hit a roadblock with multilingual data, but the fantastic folks at Taleo had already solved the problem.

TCC’s pipelines handle data in UTF-8, but many enterprise systems will produce output in UTF-16. So which one is better? There is a common misconception that UTF-8 cannot store all language characters and that UTF-16 is required. That’s not true: UTF-8 is a variable-width encoding that can represent every Unicode character, just like UTF-16. Here is a spectacular explanation of how it works.

So if you get data in UTF-16 (or any other encoding), how do you load it via TCC? In the configuration file, TCC provides an option to set the encoding of the source file.

image

If you know the encoding of the source, you enter it here. This will work for standard import files. There is also an option to set the response encoding.

image

But some complicated TCC configurations, like NetChange, require the pipeline data to be in UTF-8. So how do you get around that?

The encoding can be changed in the configuration file. Go to the Pre-Processing tab and add a new step.

There is an option to add a ‘Convert Encoding’ step.

image

Choose the source encoding and set the target to UTF-8. This step has to be the first step in your NetChange configuration file.

image

That's it!