
The Ragged Edge of CPI-DS

Tricks to developing, surviving, and conquering

Well, it sure isn’t SQL… Anyone who has worked in the integration space is probably familiar with SQL Developer, Oracle, MySQL, and, for the SAP purists, ABAP forms, along with the functionality and limitations of those systems for manipulating data. In the brave new world of IBP, we now have to learn a new system: Cloud Platform Integration for Data Services (CPI-DS), aka HANA Cloud Integration for Data Services (HCI-DS), aka HANA Cloud Integration (HCI).

Straight out of the gate we are presented with what appears to be a relatively straightforward drag-and-drop environment for creating interfaces from a variety of source systems to feed the beast that is SAP Integrated Business Planning. Simple 1:1 mappings that require little transformation are “fairly” straightforward, particularly with small master data tables like Resource or Mode of Transportation. However, once you get into more complicated business scenarios and transformations, many power SQL users will quickly find that the system can be quite challenging to navigate and, more importantly, does not allow for some of the wonderful constructs we know and love: union queries, looping structures, and the like. Or does it?

In this post, I’m going to take you through a mind-bending voyage of space and time to give you some alternative approaches to working within CPI-DS to conquer even the most challenging business requests all while keeping the interfaces concise and functionally correct.

Auto-Mapping & Field Nomenclature!

Now, this is likely the point where many of you are considering YouTubing cats on trampolines. STOP! This is boring, but critical. In classic database query editors, we are used to running queries, adjusting, rinse and repeat. We also have a base of code to refer back to. Now that you only have the GUI editor, procedure becomes incredibly important.

  • Name your transforms according to what operation step you are performing, e.g. if you’re pulling data from the MARC table, try “MARC_Extract”, or if you are aggregating dates up to a technical-week level, “Techweek_Agg”
  • For the lazy among us (myself included), this step requires a lot of discipline but will pay dividends later on: rename the fields to what they are going to become. In the example below, I am using the MARC.WERKS field to populate Location ID, so I rename the field in the editor to LOCID; that way I know where it needs to go in subsequent steps. Most importantly, when I get to the final step of mapping into the “Target Query” that populates the IBP tables, I can now use the “Auto-Mapping” functionality so that I don’t have to scroll endlessly through all the available fields; anyone who has mapped into a UNIPA key figure table knows what I’m referring to.
  • Use the description field: In my example, I have a description that includes the full naming convention for IBP followed by the technical field. This allows me to comprehend the incoming information in subsequent transforms. It also gives the next developer some insight into what I was trying to accomplish after I’m gone.
CPI-DS platform 1

Task vs. Dataflow Structuring

The ability to control CPI-DS tasks via Application Job Templates has opened up a whole host of possibilities for greater control and enhanced functionality within IBP. However, there is a notable drawback: execution is only possible at the task level. Now I am faced with a sticky wicket with respect to having multiple dataflows grouped into a single task. Since I can only execute a task from the application job template, what happens when I have multiple dataflows grouped into a single task? Answer: they all get executed. This may create difficulty in managing certain scenarios, but there is a solution to the problem: global variables. In the following example, I have a grouping of dataflows inside the Simple Master Data task.

CPI-DS platform 2


  • For each dataflow within the task, create a dataflow-specific global variable, e.g. $G_PRODUCT_FLOW_IO (type Integer)
  • Add a filter condition to that dataflow in one of your transforms as such: $G_PRODUCT_FLOW_IO = 1
  • Set the default value to 1, and in the description of that variable note something along the lines of “Setting this global variable to 0 will disable the Product Master Data flow”.
  • Now, at execution time, if that value is set to 0, or anything other than 1, all records are filtered out, resulting in a blank dataset.
  • A classic use case would be needing to run a task in replace mode but only wanting to replace a single table rather than the five tables represented by the dataflows in this task. By passing 0 records for one or more dataflows, those tables are unaffected by the operation, since it only replaces data when you pass 1 or more records over.
  • The most intriguing use for this is within ABAP transforms. The generated ABAP code places an IF/ENDIF check at the beginning of the form that effectively stops the query from ever starting, resulting in minimal to no stress on the ECC / S4 system.
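The gating pattern above can be sketched in plain Python. Here the global variable is modeled as an ordinary function parameter; in CPI-DS it would be a task-level global variable referenced in a transform's filter condition, and the flow and field names are illustrative only.

```python
# Sketch of the global-variable gating pattern described above.
# In CPI-DS this is a filter condition like: $G_PRODUCT_FLOW_IO = 1

def run_product_flow(rows, g_product_flow_io=1):
    """Return rows only when the flow is enabled (flag == 1)."""
    if g_product_flow_io != 1:
        return []  # flow disabled: pass zero records downstream
    return rows

products = [{"PRDID": "P-100"}, {"PRDID": "P-200"}]
print(run_product_flow(products, g_product_flow_io=1))  # both rows pass
print(run_product_flow(products, g_product_flow_io=0))  # []
```

Because a replace-mode load only touches a table when at least one record arrives, an empty result set from a disabled flow leaves that table untouched.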

Union Queries (yes you read that correctly)

Every developer will recognize the need for union queries. Especially in supply network structures, there are invariably times when queries need to be handled differently for certain business scenarios. Even more troublesome: if you need to run two tasks to accomplish this, how do you handle replace scenarios? Remember to run the first job in Replace and the second in Insert/Update? What a hassle! If only I could run a union query and combine the data within the dataflow, but there isn’t native functionality built to handle such queries in CPI-DS! Or is there?

In order to create a more perfect union, there are two approaches: one utilizes standard sources without an ABAP transform while the other can be done within the ABAP layer.

Non-ABAP Union

  • Start with the two sources (or sources followed by data transforms) and add to each a single integer field mapped to a constant value: 1 for the first and 2 for the second in a two-part union. For larger unions, add additional integers; you need a distinct value for each part. This would be a great time to drop in a description… I like to call that field “JOIN_FIELD_1” or “JOIN_FIELD_2”
  • Add a Row Generation object to the query with the number of rows equal to the number of tables or transforms you need to union together.
  • Then link all of the above into a single transform and perform a left outer join with the Row Generation object as your left table and the transforms or sources on the right.
  • Finally, bring all the fields from each query into the target fields section; you will have duplicates, but the system will add suffixes to keep you straight. Also, bring the row generation field over into the target query.
  • In the next transform, concatenate all the like fields together; numerical fields can simply be added together, since you will be adding null values to the true values.
  • Note: the system will now perceive that the length of the string fields has doubled, tripled, etc. Use the cast function to eliminate this warning.
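The steps above can be sketched in Python: generate one row per source to be unioned, left-outer-join each source onto its row number, then coalesce the suffixed duplicate columns back into a single field set. All table, field, and value names here are illustrative, not from a real system.

```python
# Sketch of the row-generation union pattern.
source_a = [{"JOIN_FIELD": 1, "LOCID": "PLANT_1", "QTY": 10}]
source_b = [{"JOIN_FIELD": 2, "LOCID": "DC_7", "QTY": 5}]

# Row Generation object: one row per source to be unioned
row_gen = [{"ROW_NUM": 1}, {"ROW_NUM": 2}]

# Left outer join: row_gen on the left, each source on the right.
# Unmatched sides come through as None, mimicking null columns.
joined = []
for gen in row_gen:
    rec = dict(gen)
    for suffix, source in (("_1", source_a), ("_2", source_b)):
        match = next((r for r in source if r["JOIN_FIELD"] == gen["ROW_NUM"]), None)
        for field in ("LOCID", "QTY"):
            rec[field + suffix] = match[field] if match else None
    joined.append(rec)

# Next transform: coalesce the suffixed duplicates back together
union = [
    {
        "LOCID": (r["LOCID_1"] or "") + (r["LOCID_2"] or ""),  # strings: concatenate
        "QTY": (r["QTY_1"] or 0) + (r["QTY_2"] or 0),          # numbers: add
    }
    for r in joined
]
print(union)  # [{'LOCID': 'PLANT_1', 'QTY': 10}, {'LOCID': 'DC_7', 'QTY': 5}]
```

Each row of the row-generation table matches exactly one source, so after the coalesce step every source row survives exactly once, which is precisely a union.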

ABAP Union (The Mount Everest of Hacks)

Disclaimer: This may cause headaches in the faint of heart and cause DBAs to cringe at the very sight of it. Do not leave in reach of small children. This query has been known to the State of California to cause cancer.
  • Start with the two sources (or sources followed by data transforms) and add to each a single integer field mapped to a constant value: 1 for the first and 2 for the second in a two-part union. For larger unions, add additional integers; you need a distinct value for each part. This would be a great time to drop in a description… I like to call that field “JOIN_FIELD_1” or “JOIN_FIELD_2”
  • Within the ABAP layer, a Row Generation object is no longer possible. To achieve the same effect, take an existing table that will never change; in my case I used the T006A table, containing unit-of-measure descriptions. Filter for two or three IDs that you know exist, e.g. EA. Then, within the mapping of that field, insert an “ifthenelse” statement (nesting several “ifthenelse” statements when you need more than two values) to produce a single column of distinct integer values.
  • Then link all of the above into a single transform and perform a left outer join with the surrogate row-generation table (T006A here) as your left table and the transforms or sources on the right.
  • Finally, bring all the fields from each query into the target fields section; you will have duplicates, but the system will add suffixes to keep you straight. Also, bring the row generation field over into the target query.
  • In the next transform, concatenate all the like fields together; numerical fields can simply be added together, since you will be adding null values to the true values.
  • Note: the system will now perceive that the length of the string fields has doubled, tripled, etc. Use the cast function to eliminate this warning.
  • The following illustrates a live example where this technique was used twice to union production resources, storage resources, and transportation resources, each of which required completely different logical operations, into the full dataset.
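The ifthenelse trick for manufacturing distinct row numbers inside the ABAP layer can be simulated as follows. The T006A unit-of-measure IDs and the exact key values are assumptions for illustration; the point is only that a stable filtered table plus nested ifthenelse calls yields the same distinct integers a Row Generation object would.

```python
# Sketch of the ifthenelse surrogate for row generation in the ABAP layer.

def ifthenelse(condition, then_value, else_value):
    """Mimics the CPI-DS ifthenelse() mapping function."""
    return then_value if condition else else_value

# Rows surviving the filter on a few known unit-of-measure IDs
t006a_filtered = [{"MSEHI": "EA"}, {"MSEHI": "KG"}, {"MSEHI": "M"}]

# Nested ifthenelse mapping: one distinct integer per surviving row,
# standing in for the Row Generation object's row numbers
row_numbers = [
    ifthenelse(r["MSEHI"] == "EA", 1,
               ifthenelse(r["MSEHI"] == "KG", 2, 3))
    for r in t006a_filtered
]
print(row_numbers)  # [1, 3]? No - one row per filtered key: [1, 2, 3]
```

Because the filtered table never changes, the set of generated integers is stable across runs, which is what makes it a safe stand-in for row generation.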

If your curiosity has brought you this far, you may be asking why I would go through all that trouble for a single transform; I am assuming you already know the answer. ABAP resources are not always readily available, sometimes data is dirty, and sometimes you need to get the job done with the tools you have. When you’re faced with those scenarios, having a few aces in your back pocket can be the difference between on time and on budget and the unforgiving sound of the door closing behind you on the way out.

Custom ABAP Queries

The custom ABAP query opens many doors that were previously closed when it comes to more flexible development against ECC / S4 systems. Now, an experienced ABAP developer can write more complex forms that were profoundly difficult, if not impossible, to create using standard CPI-DS functionality. It also opens up some shortcuts when you need to change destination systems without re-creating entire flows.

The case example is the query presented in the previous section. Such a query is quite complex, and the developer preference would always be to copy it to new target tables within the same target system. A situation arose with such a query where an OSS note response from SAP recommended mapping the query to a flat-file directory on the DS Agent server. Since it would be necessary to alter the target system, that would mean recreating the entire query (several hours’ worth of work) from scratch. To get around that, we were able to use the custom ABAP function to recreate the full query with minimal effort.

  • The first step is to copy the name of the ABAP program and job name from the original query.
  • Next, a new task is built (with identical global variables & execution properties) referencing the same source system but with a filestore target system.
  • After building the task, within a new dataflow, a custom ABAP query is brought into the designer and renamed to the same name as the last transform / aggregation query in the original ABAP query. In the example query, that would be “Combine_2.”
  • All of the fields and datatypes should match those contained within the original “Combine_2” query.
  • There is a button labeled ABAP Form Editor; this can be left blank if the program is already in the system. When the task is executed, the pre-loaded program will then be run.
  • Any global variables used within the transform should be mapped in the parameter section. The parameters can be named arbitrarily as long as they reference the “$G” variables built into the flow.
  • Finally, all the query steps following the ABAP endpoint need to be replicated up to the point of the target query, but mapped to the identical fields defined in the filestore target.

Wrap-Up

While some of these techniques are certainly on the fringe of what CPI-DS was meant to do, when applied judiciously many of them can be lifesavers when business rules throw a wrench in your transformations. Others are just good to know about when the off-color requests come through from the business, SAP, or any other parties interested in severely raining on your otherwise happy development days. Stay tuned to SCM Connections: Get Smart for other great content related to SAP Integrated Business Planning.
