The following article is specific to a tool from IBM known as Enterprise Generation Language. I offer it not so much as a solution specific to EGL, but as a model of the tenets I believe are critical to effective source and configuration management for z/OS systems, the main one being: ultimately, the TRUE source (not just the generated or derived source) needs to be captured for auditing and management purposes. It is not good enough for “models” on distributed platforms to be “the source” while only what they generate is imported as the complete application; I believe that to truly safeguard an application on z/OS, I must be able to recreate that application from the “stuff” I have stored… and my place of storage for applications is Endevor.
In the past, I was asked to investigate the options for integrating IBM’s Enterprise Generation Language (EGL) for z/OS into Endevor. What choices does an Endevor site have for securing these applications so that the same integrity Endevor gives to “native” code also extends to “generated” code?
Based on my research, I have been able to determine the following:
Findings:
- Unlike some other CASE tools (such as CA GEN), which offer the option of generating code directly on z/OS, the EGL ecosystem requires the target language to be generated on the workstation.
- One of the “choices” during COBOL code generation is to have the code automatically delivered, compiled, and otherwise made ready on z/OS directly from Enterprise Developer on the workbench.
Note in this flow that at one point you can specify PREP=Y. This instruction on the workstation causes the generated COBOL, JCL, and, if necessary, BIND statements to be transferred to the mainframe for execution there (a sketch of the kind of compile/link job that results follows this list). Otherwise, all built routines remain on the workstation for delivery to the z/OS platform by whatever means you choose.
- All sites contacted, or from whom I have been able to get information, have indicated that they store their EGL source in a distributed solution (either ClearCase or Harvest) and the generated z/OS source in Endevor. Whether the generated source gets into Endevor manually or automatically, I have not been able to determine.
- The fact that sites ARE saving something referred to as EGL source in their distributed solutions is evidence (as are the references in the EGL manuals) that there IS EGL source that needs to be stored.
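To make the PREP=Y flow noted above concrete, the following is a deliberately generic sketch of the kind of compile/link job that ends up running on z/OS once the generated COBOL and JCL arrive there. The program name, dataset names, and job card are hypothetical, and the JCL IBM actually generates (plus any site-tailored build scripts) will differ; the point is simply that a conventional compile and link-edit is what ultimately executes on the mainframe.

    //EGLPREP  JOB (ACCT),'EGL PREP SKETCH',CLASS=A,MSGCLASS=X
    //*---------------------------------------------------------------*
    //* Hypothetical compile and link-edit of one EGL-generated COBOL  *
    //* program, using IBM's IGYWCL compile-and-link procedure         *
    //* (assuming it is available in your proclib).  PAYROLL1 and the  *
    //* HLQ.EGL.* datasets are illustrative only.                      *
    //*---------------------------------------------------------------*
    //PAYROLL1 EXEC IGYWCL
    //COBOL.SYSLIB  DD DSN=HLQ.EGL.COPYLIB,DISP=SHR
    //COBOL.SYSIN   DD DSN=HLQ.EGL.GENCOBOL(PAYROLL1),DISP=SHR
    //LKED.SYSLMOD  DD DSN=HLQ.EGL.LOADLIB(PAYROLL1),DISP=SHR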
Unknowns:
- Is there a name, label, title, or other identifier in the EGL source that correlates to the generated z/OS elements? This is key to providing a quasi-automatic solution.
Design Options:
- EGL in Distributed Solution / Manual Delivery of z/OS Components
This option appears to be the most prevalent amongst those sites that are using EGL. Note that one of the other indicators from my research is the scarcity of sites using or implementing EGL at this time; while this may change in the future, there is limited experience and few “current designs” to draw upon. This solution would, as the title implies, store the EGL source in a distributed SCM solution, perform the generation on the workstation, FTP or otherwise transmit the generated source to the mainframe, and then ADD/UPDATE that source into Endevor for compilation.
Note that both the transmission of the source generated on the workstation and the ADD/UPDATE of that source into Endevor can be accomplished today without signing on to the mainframe, by performing this step through Change Manager Enterprise Workbench (CMEW).
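For reference, and for sites that want to drive this step in batch rather than interactively, the ADD/UPDATE itself is ordinary Endevor SCL. The sketch below assumes the generated COBOL has already been transmitted into a staging PDS; the element name, dataset names, and inventory location (environment/system/subsystem/type) are hypothetical and would be replaced with your own values, and the STEPLIB would point at your own Endevor libraries.

    //NDVRADD  JOB (ACCT),'EGL TO ENDEVOR',CLASS=A,MSGCLASS=X
    //*---------------------------------------------------------------*
    //* Endevor batch ADD (UPDATE IF PRESENT) of EGL-generated COBOL   *
    //* that was FTP'd to a staging PDS.  All names are illustrative.  *
    //*---------------------------------------------------------------*
    //ADD      EXEC PGM=NDVRC1,PARM='C1BM3000',REGION=4M
    //STEPLIB  DD DSN=YOUR.ENDEVOR.AUTHLIB,DISP=SHR    SITE-SPECIFIC
    //C1MSGS1  DD SYSOUT=*
    //BSTIPT01 DD *
      ADD ELEMENT 'PAYROLL1'
          FROM DSNAME 'HLQ.EGL.STAGE.COBOL' MEMBER 'PAYROLL1'
          TO ENVIRONMENT 'DEV' SYSTEM 'FINANCE' SUBSYSTEM 'PAYROLL'
             TYPE 'COBOL' STAGE NUMBER 1
          OPTIONS CCID 'EGL00001'
                  COMMENT 'COBOL GENERATED FROM EGL'
                  UPDATE IF PRESENT .
    /*

With a generate processor in place for the COBOL type, this single action both stores the generated source and triggers the compile/link under Endevor’s control.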
- EGL in Distributed Solution / Automatic Delivery of z/OS Components
In this scenario, the EGL would still be stored in the distributed SCM solution. However, if you specified PREP=Y, the generated source would automatically be delivered to Endevor and compiled under Endevor’s control.
This scenario would require research into, and modification of, the IBM-provided z/OS Build Server. Based on the research conducted to date, the z/OS Build Server is a started task that invokes the site-specific compile, link, and bind processes. This process could, theoretically, be modified to instead execute an Endevor ADD/UPDATE action, so that the source is automatically stored and then compiled/linked/bound by Endevor rather than by the “default” process provided by IBM.
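As a rough illustration only (the build scripts shipped with the z/OS Build Server, their step names, and the way they substitute the member being built would all have to be confirmed against the IBM documentation), the compile/link steps the build server normally drives could be replaced with a single step that hands the delivered source to Endevor:

    //*---------------------------------------------------------------*
    //* Hypothetical replacement step for the build server's compile   *
    //* script: ADD/UPDATE the delivered COBOL into Endevor and let    *
    //* the Endevor generate processor do the compile/link/bind.       *
    //* PGMNAME stands for whatever substitution mechanism the build   *
    //* scripts actually provide for the member being built.           *
    //*---------------------------------------------------------------*
    //NDVRADD  EXEC PGM=NDVRC1,PARM='C1BM3000',REGION=4M
    //C1MSGS1  DD SYSOUT=*
    //BSTIPT01 DD *
      ADD ELEMENT 'PGMNAME'
          FROM DSNAME 'HLQ.EGL.PREP.COBOL' MEMBER 'PGMNAME'
          TO ENVIRONMENT 'DEV' SYSTEM 'FINANCE' SUBSYSTEM 'PAYROLL'
             TYPE 'COBOL' STAGE NUMBER 1
          OPTIONS CCID 'EGLPREP'
                  COMMENT 'DELIVERED BY EGL Z/OS BUILD SERVER'
                  UPDATE IF PRESENT .
    /*

In this flow the “default” compile the workstation requests is bypassed; Endevor’s processors become the build of record.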
- EGL in Endevor Complete
In this scenario, the generated z/OS components remain on the workstation. All components, including the EGL source, COBOL source, link statements, and anything else created by EGL, are then merged into a single “element”, with each type of source perhaps identified by a separator line of some sort (maybe a string of “***********”). Endevor’s ADD/UPDATE process would then run the different source components through their appropriate compile/link/bind programs; i.e., the first step in the processor would unbundle the different source types into temporary files, and those temporary files would then become the source that the remaining processor steps generate from.
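The first steps of a generate processor for such a bundled element might look something like the sketch below. CONWRITE is the standard Endevor processor utility for writing an element’s source to a work dataset; the UNBUNDLE program, however, is purely hypothetical, standing in for a site-written utility or REXX exec that would split the bundle on the separator lines into its constituent parts, and all dataset names are illustrative.

    //*---------------------------------------------------------------*
    //* Sketch: first steps of a generate processor for a bundled      *
    //* EGL/COBOL/link/bind element.                                   *
    //*---------------------------------------------------------------*
    //CONWRITE EXEC PGM=CONWRITE,PARM='EXPINCL(N)'
    //ELMOUT   DD DSN=&&BUNDLE,DISP=(,PASS),UNIT=SYSDA,
    //            SPACE=(CYL,(1,1)),
    //            DCB=(RECFM=FB,LRECL=80,BLKSIZE=0)
    //*
    //* UNBUNDLE (hypothetical, site-written) splits &&BUNDLE on the
    //* "****" separator lines into one temporary file per source type.
    //UNBUNDLE EXEC PGM=UNBUNDLE
    //BUNDLE   DD DSN=&&BUNDLE,DISP=(OLD,PASS)
    //EGLOUT   DD DSN=&&EGL,DISP=(,PASS),UNIT=SYSDA,SPACE=(CYL,(1,1))
    //COBOUT   DD DSN=&&COBOL,DISP=(,PASS),UNIT=SYSDA,SPACE=(CYL,(1,1))
    //LNKOUT   DD DSN=&&LNK,DISP=(,PASS),UNIT=SYSDA,SPACE=(TRK,(5,5))
    //BNDOUT   DD DSN=&&BIND,DISP=(,PASS),UNIT=SYSDA,SPACE=(TRK,(5,5))
    //*
    //* Later steps compile &&COBOL, link-edit with &&LNK, and bind
    //* with &&BIND, exactly as an ordinary COBOL processor would.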
Note: In order for any of the following designs to work, the previously documented “unknown” must be resolved. These designs will only work if there is “something” in the EGL source that can be directly tied to the generated z/OS components.
- EGL in Endevor / EGL Delivery to z/OS
In this scenario, code generation would take place on the workstation, and PREP=Y would execute as provided by IBM, with no modifications (other than site-specific ones) to IBM’s z/OS Build Server. This would result in the COBOL, link, and BIND source being delivered to PDSs on the mainframe and compiled there.
Assuming the delivery of the components to z/OS can be made to “protected” libraries, the EGL source could then be ADD/UPDATEd into Endevor using CMEW. The ADD/UPDATE process would then query the EGL source and automatically copy or otherwise bring in the COBOL, link, and bind source created and delivered earlier. The load modules created earlier would be ignored; the source would be recompiled under Endevor’s control.
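Assuming the “unknown” described earlier resolves favorably, that is, the member names of the delivered COBOL, link, and bind source either match the Endevor element name or can be derived from something inside the EGL source, the EGL type’s generate processor could pull the delivered pieces in with ordinary copy steps. The sketch below uses &C1ELEMENT, the standard Endevor processor symbolic for the element name, as that key; the staging dataset names are hypothetical, and similar steps would fetch the link and bind members.

    //*---------------------------------------------------------------*
    //* Sketch: copy the COBOL member delivered by PREP=Y into a       *
    //* temporary library, keyed on the element name, so the rest of   *
    //* the processor can compile it under Endevor's control.          *
    //*---------------------------------------------------------------*
    //GETCOB   EXEC PGM=IEBCOPY
    //SYSPRINT DD SYSOUT=*
    //INDD     DD DSN=HLQ.EGL.PREP.COBOL,DISP=SHR
    //OUTDD    DD DSN=&&COBOL,DISP=(,PASS),UNIT=SYSDA,
    //            SPACE=(CYL,(1,1,10)),
    //            DCB=(RECFM=FB,LRECL=80,BLKSIZE=0)
    //SYSIN    DD *
      COPY INDD=INDD,OUTDD=OUTDD
      SELECT MEMBER=((&C1ELEMENT,,R))
    /*
    //* A later compile step would point its SYSIN at
    //* &&COBOL(&C1ELEMENT); the delivered load modules are ignored.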
There are a variety of other options, designs, and hybrids/combinations of the above ideas that I can think of. However, this paper should serve as the beginning of a discussion concerning which model or architecture best suits the needs of the site.